PrintWriter memory consumption

Hi,
I have a Java program with 20 threads. Each thread uses a different PrintWriter, already open, to write to a separate text file. The problem is that I get a java.lang.OutOfMemoryError after a certain period of time. I was wondering if anyone could tell me whether PrintWriter needs more memory as the text file grows. If so, is there any way to perform these write operations without running out of memory, or does PrintWriter's memory footprint remain constant regardless of the size of the text file?
         Cheers,
         everton690  
1 Solution
 
CEHJ commented:
Your problem is more likely to be caused by not closing Writers or other resources after use.
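For the record, a minimal sketch of what "closing after use" looks like (the file name demo.txt is just an illustration, not the asker's code); in Java of this era, a finally block guarantees the Writer is released even if a write throws:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class CloseWriterDemo {
    public static void main(String[] args) throws IOException {
        PrintWriter out = null;
        try {
            out = new PrintWriter(new FileWriter("demo.txt"));
            out.println("line 1");
            out.println("line 2");
        } finally {
            if (out != null) {
                out.close(); // flushes the buffer and releases the file handle
            }
        }
    }
}
```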

everton690 (Author) commented:
The thing is that the Writers must remain open to continually allow data to be written to the text files.

mmuruganandam commented:
It seems like all your content is staying in memory and not actually going to the file, I suppose.

If it is kept in memory, then as the text grows you will obviously get an out-of-memory error.


Regards,
Muruga
 
mmuruganandam commented:
Can you post your code?

CEHJ commented:
Tell us more about what's happening. So far I'm assuming 20 PrintWriters and one text file. How big is the text file?

everton690 (Author) commented:
There are 20 PrintWriters, 20 text files and 20 threads. Each thread calls a function that uses a PrintWriter that is already open to write to its text file. The program then calls another function that reads from this text file.

CEHJ commented:
Sounds to me like you should be reading from and writing to buffers at this point - not files (at least until all processing is finished)
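A hypothetical sketch of that idea, using a StringWriter as the in-memory buffer (the class and the "42 : result" line are illustrative, not the asker's code). Note the trade-off: keeping very large output entirely in memory could itself exhaust the heap, so this only fits if each thread's data is modest:

```java
import java.io.PrintWriter;
import java.io.StringWriter;

public class BufferDemo {
    // Write into an in-memory buffer instead of a file
    static String buffered() {
        StringWriter buffer = new StringWriter();
        PrintWriter out = new PrintWriter(buffer);
        out.println("42 : result");
        out.flush(); // push any pending output into the buffer
        return buffer.toString(); // read it back without touching the disk
    }

    public static void main(String[] args) {
        System.out.print(buffered());
    }
}
```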

mmuruganandam commented:
Let me tell you one thing: if all the content is really going into the file, then there is very little chance of an out-of-memory error.

Seeing that portion of the code would help us find the exact cause.

Mayank S (Associate Director - Product Engineering) commented:
The JVM does not use all the available memory. You can specify the maximum heap it may use with the -Xmx option. For example:

-Xmx200m

- tells the JVM to use at most 200 MB of heap

Mayank S (Associate Director - Product Engineering) commented:
I would still agree with CEHJ that you should use buffers instead of writing directly to files. You can try BufferedWriter.
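A sketch of that wrapping (the file name and line count here are made up for illustration). The point for the original question: the BufferedWriter's buffer is a fixed size, so the writer's memory use does not grow with the file:

```java
import java.io.BufferedWriter;
import java.io.FileWriter;
import java.io.IOException;
import java.io.PrintWriter;

public class BufferedFileDemo {
    public static void main(String[] args) throws IOException {
        // Writes are collected in a fixed-size buffer before hitting the disk
        PrintWriter out = new PrintWriter(
                new BufferedWriter(new FileWriter("buffered.txt")));
        try {
            for (int i = 0; i < 1000; i++) {
                out.println("line " + i);
            }
        } finally {
            out.close(); // flush the remaining buffered lines to disk
        }
    }
}
```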

Mayank S (Associate Director - Product Engineering) commented:
Ah, I thought CEHJ told you to use buffers. I guess he meant that you were *already using* buffers. Well, which is it in your case?

CEHJ commented:
>>I would still agree with CEHJ ...

Difficult to tell exactly though without knowing more

Mayank S (Associate Director - Product Engineering) commented:
Generally, I prefer using buffers, since disk I/O is slower. But well, if it gives an OutOfMemoryError, I would eliminate the buffers ;-)

everton690 (Author) commented:
Here is an example of a thread and the read and write functions.  Hopefully this will help.  


Thread thread = new Thread(new Runnable() {
    public void run() {

        writesomething(input param, nameOfPrintWriter);

        while (program not finished)
        {
            Writer.flush();

            readfromfile(input param, input param);
        }

        // do some computation

        writetofileagain(input param, input param);
    }
});
thread.start();


// function to write to file

writesomething(input param, nameOfPrintWriter) {
    nameOfPrintWriter.println(something);
}

// function to read from file

readfromfile(int inputparam, nameoffile) {

    FileReader fr = null;
    String s = null;

    String patt = " " + inputparam + " :";

    String file =
        System.getProperty("user.dir") +
        System.getProperty("file.separator") + nameoffile + ".txt";
    String str = null;

    try {
        fr = new FileReader(file);
        LineNumberReader lr = new LineNumberReader(fr);

        while ((str = lr.readLine()) != null) {
            if (str.indexOf(patt) != -1) {
                s = str.substring(11);
                // lr.close();
            }
        }
        fr.close();
    }
    catch (ArrayIndexOutOfBoundsException e) {
        System.err.println("Caught ArrayIndexOutOfBoundsException: " + e.getMessage());
    }
    catch (IOException e) {
        System.err.println("Caught IOException in readfromfile: " + e.getMessage());
    }

    return s;
}

CEHJ commented:
What's the average file size?

Mayank S (Associate Director - Product Engineering) commented:
Looks like you might have to try the -Xmx option.

Mayank S (Associate Director - Product Engineering) commented:
Check the amount of free memory that you have. I suspect the JVM is not using all of it. Suppose it's 200 MB; then use:

java -Xmx200m

You can also set the initial heap size with -Xms (the per-thread stack size is controlled by -Xss, not -Xms).
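For reference, a sketch of how those flags combine on the command line (MyProgram is a placeholder class name): -Xmx sets the maximum heap, -Xms the initial heap, and -Xss the per-thread stack size:

```shell
# MyProgram is a placeholder; adjust the sizes to your machine
java -Xms64m -Xmx200m -Xss512k MyProgram
```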

CEHJ commented:
Frankly, I'm surprised that it's working at all (the OutOfMemory aside): if the Writers are not being flushed or closed, the reader might get an old copy of the file instead of the data from the most recent write. *Was* it working?

Anyway, those (almost certainly unnecessarily) tight loops will cause a massive allocation of resources that will eventually produce your OOM.

everton690 (Author) commented:
I tried using the -Xmx option, but it only delays the java.lang.OutOfMemoryError. The file sizes grow continually until each one contains roughly 700,000 lines of data. The reason the data is being sent to text files rather than stored in memory is to reduce the strain on memory. Do you think this method of using Writers and Readers increases the strain on memory as the file size grows, or is the strain independent of the file size?

mmuruganandam commented:
My suspicion is the LineNumberReader.

I am not seeing any use of its line numbers anyway. Replace it with a BufferedReader. That might help.
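A sketch of the suggested swap (the class, file and pattern names here are illustrative, not the asker's code). A BufferedReader reads one line at a time through a fixed-size buffer, so the reader's memory use stays flat no matter how big the file gets:

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadDemo {
    // Scan a file line by line for a pattern; only the current line
    // is held in memory, regardless of the file's size
    static String findLine(String file, String patt) throws IOException {
        BufferedReader br = new BufferedReader(new FileReader(file));
        try {
            String str;
            while ((str = br.readLine()) != null) {
                if (str.indexOf(patt) != -1) {
                    return str;
                }
            }
        } finally {
            br.close(); // always release the file handle
        }
        return null;
    }
}
```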

Mayank S (Associate Director - Product Engineering) commented:
If it delays the OutOfMemory error, then the option is working to some extent - the heap is still too small, and you need even more. I suggest you check how much free memory your system has, and then give the entire amount as the parameter to -Xmx. Try increasing the stack size as well.

CEHJ commented:
I think the OOM, given the code, would probably happen with much smaller files too, but the fact that they're large, or at least becoming large, is certainly going to make things worse.

You should probably be doing this with your own buffers, but at the moment I can't think of a good way to implement it.

sciuriware commented:
I am using some 70 PrintWriters, with buffering.

Search in another direction: it's definitely NOT the I/O that causes the out-of-memory!


;JOOP!

gnoon commented:
>> while (program not finished)
So, when does the program finish?

Maybe an infinite loop is causing the OOM. Try printing something inside the loop, and run only 1 thread to test it.

everton690 (Author) commented:

The cause of the memory leak had nothing to do with any of the I/O. I gave the points to sciuriware as he was right, but providing a full solution would have required a lot more information.

Cheers,
everton690  

Mayank S (Associate Director - Product Engineering) commented:
Were you able to figure out what the actual cause was?