I am trying to create a Java application that retrieves HTML source (from HTML files uploaded by staff members) from a slow server. There are approximately 500 of these files, and they are rather long because they store a lot of information. I need to (a) figure out how to download each file into a string (each file into one element of a string array), and I need to do this as fast as possible (I know there are several ways to do this, but I need the fastest method), and (b) figure out how to read the headers of all the files before I begin downloading, so that I can show the download progress as a percentage.
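One answer-style sketch for both parts: `HttpURLConnection.getContentLengthLong()` gives you the `Content-Length` header for the progress percentage, and a fixed thread pool overlaps the waits on the slow server, which is where the speedup actually comes from. The pool size of 10 and the byte-vs-char progress approximation are assumptions; adjust both for your server and encoding.

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class Downloader {
    // Progress as an integer percentage; a non-positive total means
    // the server did not send a Content-Length header
    static int percent(long done, long total) {
        return total <= 0 ? 0 : (int) (100 * done / total);
    }

    // Download one URL into a String, using Content-Length for progress.
    // Note: Content-Length counts bytes while we count chars, so the
    // percentage is approximate for multi-byte encodings.
    static String fetch(String url) throws Exception {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        long total = conn.getContentLengthLong(); // -1 when the header is absent
        StringBuilder sb = new StringBuilder(total > 0 ? (int) total : 8192);
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream()))) {
            char[] buf = new char[8192];
            long done = 0;
            int n;
            while ((n = in.read(buf)) != -1) {
                sb.append(buf, 0, n);
                done += n;
                System.out.println(url + ": " + percent(done, total) + "%");
            }
        }
        return sb.toString();
    }

    // Fetch all ~500 URLs concurrently; results land in a String array,
    // one element per file, in the original order
    static String[] fetchAll(List<String> urls) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(10);
        List<Future<String>> futures = new ArrayList<>();
        for (String u : urls) {
            futures.add(pool.submit(() -> fetch(u)));
        }
        String[] pages = new String[urls.size()];
        for (int i = 0; i < pages.length; i++) {
            pages[i] = futures.get(i).get();
        }
        pool.shutdown();
        return pages;
    }
}
```

If you want an overall percentage across all files rather than per file, issue HEAD requests first, sum the `Content-Length` values, and feed running totals into the same `percent` helper.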
Also, if possible, I need to figure out a way to store the downloaded content in a Microsoft Access database.
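For the Access part, one option is plain JDBC with a driver that understands `.mdb`/`.accdb` files (UCanAccess is a pure-Java third-party driver; the JDBC-ODBC bridge is the older route). The connection URL, the `Contacts` table, and its column names below are placeholders, not something from your setup:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class AccessStore {
    // Assumed table layout matching the four fields in each record;
    // rename table and columns to match your actual database
    static final String INSERT_SQL =
        "INSERT INTO Contacts (Firstname, Lastname, EntryDate, Phone) "
        + "VALUES (?, ?, ?, ?)";

    // Insert one parsed record using a PreparedStatement, which also
    // protects against quoting problems in the data
    static void insert(Connection conn, String first, String last,
                       String date, String phone) throws Exception {
        try (PreparedStatement ps = conn.prepareStatement(INSERT_SQL)) {
            ps.setString(1, first);
            ps.setString(2, last);
            ps.setString(3, date);
            ps.setString(4, phone);
            ps.executeUpdate();
        }
    }

    public static void main(String[] args) throws Exception {
        // Placeholder path; requires the UCanAccess jar on the classpath
        try (Connection conn = DriverManager.getConnection(
                "jdbc:ucanaccess://C:/data/contacts.mdb")) {
            insert(conn, "John", "Smith", "2008-01-15", "555-1234");
        }
    }
}
```

For bulk loads of many rows, wrapping the inserts in a single transaction (`conn.setAutoCommit(false)` and one `commit()` at the end) is usually much faster than committing row by row.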
The data is stored in the following manner: Firstname - Lastname - Date - Phone Number.
I can parse the content easily, but I need a very fast and effective method of storing the data for quicker access in the future. The files are updated every week, so I need to get the date of the first item in the database and compare it to the date on each line of the updated page, so that I don't download unnecessary content.
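The incremental check can be sketched with `java.time`: split each line on the " - " separator, parse the date field, and keep only lines newer than the latest date already in the database. The `yyyy-MM-dd` date pattern is an assumption; swap in whatever format your files actually use.

```java
import java.time.LocalDate;
import java.time.format.DateTimeFormatter;

public class RecordFilter {
    // Assumed date pattern in the "Firstname - Lastname - Date - Phone
    // Number" records; change to match the real files
    static final DateTimeFormatter FMT = DateTimeFormatter.ofPattern("yyyy-MM-dd");

    // Split into at most 4 fields so a phone number containing dashes
    // (e.g. "555-1234") is not split further
    static String[] parse(String line) {
        return line.split(" - ", 4);
    }

    // True if this line's date is strictly after the latest date
    // already stored in the database
    static boolean isNew(String line, LocalDate latestStored) {
        LocalDate d = LocalDate.parse(parse(line)[2], FMT);
        return d.isAfter(latestStored);
    }
}
```

Filtering this way means each weekly run only inserts the rows added since the last run, instead of re-storing all 500 files' worth of records.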
I know this is a long question, so I have set the points as high as I could. Also, since there are so many parts, I will most likely be distributing the points.
Thanks for your help, pop.