Hello all. I have been working, sort of half-heartedly, on a screen scraper application that downloads a given web page, then extracts, formats, and presents the data. I have been using the System.Net.WebClient class to retrieve the web page like so:
Private Function GetPageHTML(ByVal URL As String) As String
    ' Retrieves the HTML from the specified URL
    Dim objWC As New System.Net.WebClient
    Return New System.Text.UTF8Encoding().GetString(objWC.DownloadData(URL))
End Function
It seems to work fairly well, except that if the page is unavailable or some other network error occurs, the application either freezes and pretty much never comes back, or I get a vague pop-up telling me there was an unhandled exception. I had done some work in PHP with cURL, which had a timeout value that prevented this, but a perusal of the docs does not reveal any such ability in this class.
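For what it's worth, wrapping the download in a Try...Catch at least turns the unhandled-exception pop-up into something the caller can deal with. Here's a rough sketch of that idea — the function name and the empty-string fallback are just my assumptions, not anything from the docs:

```vb
Private Function GetPageHTMLSafe(ByVal URL As String) As String
    ' Hypothetical variant of GetPageHTML that traps network failures
    ' instead of letting the exception bubble up unhandled.
    Dim objWC As New System.Net.WebClient
    Try
        Return System.Text.Encoding.UTF8.GetString(objWC.DownloadData(URL))
    Catch ex As System.Net.WebException
        ' Covers "page unavailable" type errors: DNS failure, HTTP error, etc.
        ' Returning an empty string here is just an example policy.
        Return String.Empty
    Finally
        objWC.Dispose()
    End Try
End Function
```

This doesn't fix the timeout problem by itself, but it stops a failed request from killing the application.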
So, I'm looking for a way to make a web page retrieval time out and/or return an error if there's an issue.
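From my reading, HttpWebRequest (which sits underneath WebClient) does expose a Timeout property measured in milliseconds, so one possible sketch of a timeout-aware retrieval might look like the following. The 10-second value and the function name are just placeholders I picked for illustration:

```vb
Private Function GetPageHTMLWithTimeout(ByVal URL As String) As String
    ' Uses HttpWebRequest directly so a timeout can be set on the request.
    Dim objRequest As System.Net.HttpWebRequest = _
        CType(System.Net.WebRequest.Create(URL), System.Net.HttpWebRequest)
    objRequest.Timeout = 10000 ' give up after 10 seconds (example value)

    ' GetResponse throws a WebException on timeout or network error,
    ' so callers should wrap this in Try...Catch.
    Dim objResponse As System.Net.WebResponse = objRequest.GetResponse()
    Try
        Dim objReader As New System.IO.StreamReader( _
            objResponse.GetResponseStream(), System.Text.Encoding.UTF8)
        Return objReader.ReadToEnd()
    Finally
        objResponse.Close()
    End Try
End Function
```

If the timeout elapses, GetResponse raises a WebException rather than hanging, which sounds like the behavior I'm after.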
My forte is really VB 6.0, so please keep that in mind as you answer. I'm not very experienced in .NET yet, but I'm rapidly learning to like it.
Setting this at 250 points because I'm betting this is an easy answer (if not, we can negotiate), but I'm looking for an answer fairly quickly, before I don't need it anymore. :-) Thanks for reading.