Java: output from a file

Hi guys,
I have a problem reading the results from a text file (results.txt) and outputting them to a text area in a GUI. At the moment I'm using a for loop and the readLine() method from the BufferedReader class; results (below) is the text file, and this is the working code:

try {
    bufferedReader = new BufferedReader(new FileReader(results));

    for (int n = 0; n < 100; n++) {
        queryResult += bufferedReader.readLine();
    }
} catch (IOException e) {}

return queryResult;
}
but this for loop gives me the first 100 lines of the text file and then stops. I want to read until the end of the file and then stop the output; queryResult takes the output from the file. I have been trying a while loop but have not got it working yet.
Any help would be greatly welcomed,
thanks
Damo.
dpconnagh asked:
 
Venci75 commented:
try {
  bufferedReader = new BufferedReader(new FileReader(results));
  String line;
  while ((line = bufferedReader.readLine()) != null) {
    queryResult += line;
  }
} catch (IOException e) {}

return queryResult;
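
To get this into the text area from the original question, a minimal sketch along these lines should work (the class, method and field names below are only illustrative, not from the posted code):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import javax.swing.JTextArea;

public class ResultsViewer {
    // Illustrative field name; use whatever text area your GUI already has.
    private JTextArea outputArea = new JTextArea();

    // Reads every line of the file and returns the whole content as one string.
    private String readResults(String filename) {
        StringBuffer queryResult = new StringBuffer();
        BufferedReader bufferedReader = null;
        try {
            bufferedReader = new BufferedReader(new FileReader(filename));
            String line;
            while ((line = bufferedReader.readLine()) != null) {
                queryResult.append(line).append("\n");
            }
        } catch (IOException e) {
            e.printStackTrace();
        } finally {
            if (bufferedReader != null) {
                try { bufferedReader.close(); } catch (IOException ignored) {}
            }
        }
        return queryResult.toString();
    }

    // Puts the file contents into the text area.
    public void showResults(String filename) {
        outputArea.setText(readResults(filename));
    }
}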
   
 
Peter Kwan (Analyst Programmer) commented:
You can try this:

String line;
line = bufferedReader.readLine();

while (line != null) {
   queryResult += line;
   line = bufferedReader.readLine();
}
 
mzimmer74 commented:
Here is the one I personally use (as I don't like having an assignment in the while loop), but either of the two previous solutions also works:

BufferedReader bufferedReader ......

while (bufferedReader.ready())
{
  queryResult += bufferedReader.readLine();
}
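
For completeness, a self-contained version of this approach might look like the following (the class name and the use of results.txt are just placeholders):

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadWithReady {
    // Reads every line of the given file using the ready() check above.
    static String readAll(String filename) throws IOException {
        BufferedReader bufferedReader = new BufferedReader(new FileReader(filename));
        StringBuffer queryResult = new StringBuffer();
        // ready() reports whether the next read can complete without blocking;
        // for an ordinary local file that effectively means data remains.
        while (bufferedReader.ready()) {
            queryResult.append(bufferedReader.readLine()).append("\n");
        }
        bufferedReader.close();
        return queryResult.toString();
    }

    public static void main(String[] args) throws IOException {
        System.out.println(readAll("results.txt"));
    }
}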

 
Jim Cakalic (Senior Developer/Architect) commented:
When reading _all_ contents of a file, I usually do something like this:

    private static String readFully(String filename) throws Exception {
        File file = new File(filename);
        FileReader in = new FileReader(file);
        // file.length() counts bytes, which for single-byte encodings is an
        // upper bound on the number of chars in the file.
        char[] buf = new char[(int)file.length()];
        int count = in.read(buf);
        in.close();
        return (count < 0) ? "" : new String(buf, 0, count);
    }

This has two benefits. First, it grabs the entire file at once instead of in line-oriented chunks. This is more performant than using BufferedReader.readLine, which a) reads the file in buffer-size chunks, meaning that there are more physical I/O requests, b) searches through the buffered data to find line terminators, and c) constructs a new String object for each line that it returns.

Second, this technique _also_ preserves line terminators in the input. Each line returned from BufferedReader.readLine has the line terminators knocked off. You're probably gonna want these for intelligible display in the JTextArea. To obtain the equivalent, you'd have to add the line terminators back on like this:

    private static String readFully(String filename) throws Exception {
        String terminator = System.getProperty("line.separator");
        File file = new File(filename);
        BufferedReader in = new BufferedReader(new FileReader(file));
        StringBuffer buf = new StringBuffer((int)file.length());
        String line = null;

        while ((line = in.readLine()) != null) {
            buf.append(line).append(terminator);
        }
        in.close();
        return buf.toString();
    }

BTW, notice that I used a StringBuffer to build the content? Don't use String concatenation as you did in your post and as the others did in their comments. String concatenation has a huge amount of overhead from construction of temporary StringBuffers and char[] resizing and copying. On even small files, say 2K, concatenating into a StringBuffer is five times faster. By the time you hit a file size of 10K, you're looking at better than 20 times faster. Reading a 50K file is over 100 times faster. And I stopped timing when I got to a 100K file and the StringBuffer method was 750 times faster -- using String concatenation it took more than 20 seconds just to read the file one time. Reading directly into a char[] as I originally suggested is about 30% faster than using the BufferedReader to read lines regardless of the file size. (All timings were taken with JDK 1.3 on NT4.)

In short, read the file all at once if you can. And don't do concatenation of non-constant Strings unless you just don't care about performance.
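
To make the difference concrete, here is a rough illustration of the two ways of building the result (this is not the benchmark behind the timings above, just a sketch of the two techniques):

public class ConcatDemo {
    public static void main(String[] args) {
        // Repeated String concatenation: each += copies every character
        // accumulated so far into a new String, so the cost grows quadratically.
        String slow = "";
        for (int i = 0; i < 5000; i++) {
            slow += "line " + i + "\n";
        }

        // A single StringBuffer: characters are appended into one growing
        // buffer, so the cost grows roughly linearly with the output size.
        StringBuffer buf = new StringBuffer();
        for (int i = 0; i < 5000; i++) {
            buf.append("line ").append(i).append('\n');
        }
        String fast = buf.toString();

        System.out.println(slow.equals(fast)); // same content either way
    }
}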

Best regards,
Jim Cakalic
 
Venci75 commented:
No comment has been added lately, so it's time to clean up this TA.
I will leave a recommendation in the Cleanup topic area that this question is:
Answered by: Venci75
Please leave any comments here within the next seven days.
 
PLEASE DO NOT ACCEPT THIS COMMENT AS AN ANSWER!
 
Venci75
EE Cleanup Volunteer
 
Jim Cakalic (Senior Developer/Architect) commented:
I'll give you this one, although I think you'd be hard pressed to deny that my answer was technically superior. (No slight intended.)

Jim
 
Venci75 commented:
Yes, it is probably a better solution, but dpconnagh's code actually changes the data while reading the file: it removes the newline characters from the file content. Also, I was the first who suggested a solution. Anyway, I think you deserve points, so the best would be if the points were split. I didn't suggest this because of the cleanup recommendations.
 
Jim Cakalic (Senior Developer/Architect) commented:
Well, choosing not to debate the point, I only wanted to voice my opinion about the chosen answer. As I stated previously, I'm OK with the resolution.

Jim
 
SpideyMod commented:
per recommendation

SpideyMod
Community Support Moderator @Experts Exchange