Timing out on large file connections using URLConnection

I don't write in Java too often so this may seem a little basic.

I'm using URLConnection to scrape some sites. It works fine until it tries to scrape something like a large video. So I put a little function in to grab headers first, check for "text" in the content type, then continue from there.

This works fine for smaller items; it catches my trap for zip files and such that aren't that large to begin with. But I tested with one really large MOV file, and it just hung there trying to connect (just to get the headers).

I tried adding setConnectTimeout and setReadTimeout, first at 5000ms, then 1000ms, then 100ms, but I must be using them incorrectly because they aren't canceling the connection (even when I brought it down to 100).

Any help is appreciated, thanks.
// Getting headers (sans the function/catches etc.)
URL u = new URL( url );
URLConnection uc = u.openConnection();
uc.setConnectTimeout( 1000 );  // max wait to establish the connection
uc.setReadTimeout( 1000 );     // max wait for each individual read
String header = uc.getContentType();
System.out.println( header );
return header;
 
// Get page data (sans other stuff)
String headers = getHeaders( url );
if( headers == null || headers.indexOf("text") < 0 ){
	System.out.println( "Bad headers: " + headers );
	return "";
}else{
	System.out.println( "Good headers, continue" );
}


MattKenefick asked:
objects commented:
you could send a HEAD request to just get the content type :)
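A minimal sketch of that suggestion, assuming a plain HttpURLConnection and a hypothetical fetchContentType helper (the names are illustrative): a HEAD request asks the server for the response headers only, so even a huge MOV file costs almost nothing to inspect.

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;

public class HeadCheck {
    // Hypothetical helper: fetch only the Content-Type via a HEAD request.
    public static String fetchContentType(String url) throws IOException {
        HttpURLConnection conn = (HttpURLConnection) new URL(url).openConnection();
        try {
            conn.setRequestMethod("HEAD");  // headers only, no body is transferred
            conn.setConnectTimeout(5000);
            conn.setReadTimeout(5000);
            return conn.getContentType();   // triggers the request, reads the headers
        } finally {
            conn.disconnect();
        }
    }
}
```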
objects commented:
you can also check the content length to determine the amount of data

there is no way to set an overall timeout on URLConnection; the connect and read timeouts only cover establishing the connection and individual reads, so you would need to implement an overall limit yourself, for example by timing out the thread that is pulling the file

HttpClient, though, would be better for handling this than URLConnection
MattKenefick (Author) commented:
I thought of that, but I'm not sure I can use HttpClient because I'm sandboxed. I thought of checking Content-Length, but I figured that if it took that long to read the Content-Type header, it would be the same for Content-Length, since they're both headers. Does it read Content-Length first?

And how are the timeouts supposed to be used? I assumed it'd be a time limit until it cancels the connection, but apparently not? Unless I'm using it wrong. Any input on this? Thanks for the assistance.
jwenting commented:
HttpClient uses the same connections as URLConnection does; it just handles them for you.
MattKenefick (Author) commented:
Okay, I suppose I can try rolling back to that, but I'd still also like to know about Content-Type vs. Content-Length, and what the timeouts are really all about. Thanks, jwenting!
objects commented:
> I thought of that, but I'm not sure if I can use HttpClient because I'm sandboxed.

sandbox does not affect what you can use

> I thought of checking Content-Length, but I figured that if it took that long to read the Content-Type header, it would be the same for Content-Length, since they're both headers. Does it read Content-Length first?

It doesn't need to read the content to access the headers.

> And how are the timeouts supposed to be used?

They are how long it will wait for the server to respond
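To make those semantics concrete, here is a small sketch (the class and method names are mine, not from the thread): setConnectTimeout bounds establishing the TCP connection, while setReadTimeout bounds each individual wait for data once connected. If the server accepts the connection but never sends anything, a read eventually throws SocketTimeoutException:

```java
import java.io.IOException;
import java.net.SocketTimeoutException;
import java.net.URL;
import java.net.URLConnection;

public class TimeoutDemo {
    // Returns true if reading the response timed out, false if data arrived.
    public static boolean timesOut(String url, int millis) throws IOException {
        URLConnection uc = new URL(url).openConnection();
        uc.setConnectTimeout(millis); // max wait to establish the connection
        uc.setReadTimeout(millis);    // max wait for each individual read
        try {
            uc.getInputStream().read(); // forces the connect and the first read
            return false;
        } catch (SocketTimeoutException e) {
            return true;                // a single read blocked longer than millis
        }
    }
}
```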


MattKenefick (Author) commented:
@objects

I know headers are supposed to come before content. That's why I said Content-Type was taking a long time to load, as if it was loading the content, which is exactly the problem.

I understand the connect timeout should be until the connection is recognized, but isn't setReadTimeout how long it has to read the content? Like, if it's a 10MB video and your setReadTimeout is 2000ms, shouldn't it throw an error? Because mine doesn't.
objects commented:
so are you not even reading the content? That doesn't sound right.

> but isn't setReadTimeout how long for it to read the content?

no, it's how long it will wait for a single read to respond
CEHJ commented:
This is ultimately a threading problem. Why?

A large file is going to have roughly the same number of headers as a small one. The problem also isn't the connect or read timeout; if it were, you'd be seeing exceptions.

You're probably experiencing other networking problems, and you can certainly expect those from time to time. What shouldn't happen is for your program to have to sit and wait when they occur. That's the threading problem.

The fetch should be done in a separate thread so that your main thread can take the appropriate action when the worker thread is kept waiting around too long.
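One way to sketch that worker-thread idea (the names here are illustrative, not from the thread above) is to run the fetch as a Callable and bound the whole operation with Future.get, which gives an overall deadline that URLConnection's per-read timeout cannot:

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.TimeoutException;

public class BoundedFetch {
    // Runs any task with an overall deadline; returns null if it took too long.
    public static <T> T withDeadline(Callable<T> task, long millis) throws Exception {
        ExecutorService pool = Executors.newSingleThreadExecutor();
        Future<T> future = pool.submit(task);
        try {
            return future.get(millis, TimeUnit.MILLISECONDS); // wait at most millis
        } catch (TimeoutException e) {
            future.cancel(true); // interrupt the stuck worker thread
            return null;
        } finally {
            pool.shutdownNow();
        }
    }
}
```

In the scraper, the task would wrap the getHeaders call; a null result means the worker was kept waiting too long and the main thread can move on.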
MattKenefick (Author) commented:
Good little chart about the differences between HttpClient and URLConnection: http://www.innovation.ch/java/HTTPClient/urlcon_vs_httpclient.html
objects commented:
I was referring to Apache HttpClient:

http://hc.apache.org/httpclient-3.x/
MattKenefick (Author) commented:
I used this to get the headers without timing out. Thanks.
try {
	// HEAD request: ask for headers only, no body is transferred
	// (note: no setDoOutput(true) -- HEAD sends no request body)
	this.hurlc.setRequestMethod("HEAD");
	returnString = this.hurlc.getContentType();
	this.hurlc.disconnect();
} catch ( ProtocolException e ) {
	// ...
} catch ( RuntimeException e ) {
	// ...
}

MattKenefick (Author) commented:
Not everything I wanted to know, but the majority of it.
MattKenefick (Author) commented:
To elaborate a little further, I switched from URLConnection to HttpURLConnection. The code that creates the "hurlc" looks like this:
this.hurlc = (HttpURLConnection) new URL("http://...com").openConnection();
