Loading really large tables times out

Posted on 2004-09-08
Last Modified: 2010-04-06

I have a website that has some really large tables. When trying to load these tables, the browser sometimes times out. To see an example go to, scroll down to Cesium_137_unfil and click on 13335.

Is there anything I can do in my code to prevent this from happening? The end user will not always know how to change any settings.

Question by:lorikellyr
  • 4
LVL 53

Accepted Solution

COBOLdinosaur earned 125 total points
ID: 12008027
I don't know why you would think a transfer of that much data would not result in some timeouts.  The application is not appropriate for a web page.  If you actually have to transfer that much data, you should put it in a compressed .zip archive and let users download the files.

There are three factors at play here contributing to the problem.  The number one problem is the size of the download.  I have a very fast connection and the download time is unacceptable.  Anyone using anything but a high-speed connection is going to get some timeouts no matter what you do, unless you reduce the volume of the transfer.  So you need to assess the design requirements to see what MUST be transferred, what SHOULD be transferred, and what is being transferred simply because it is there.

The second part of this is the database acquisition.  I can't see how the queries are written, but you should analyse them to see if there is any optimization possible.  This has to be consuming a huge number of cycles and a lot of memory on the server.
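One common optimization, if the server-side language supports it, is to page the query instead of selecting every row at once.  A minimal sketch of building such a query string (the table name "readings" and key column "sample_id" are made-up stand-ins for whatever the real schema uses):

```javascript
// Sketch: build a paged SELECT instead of fetching every row at once.
// "readings" and "sample_id" are hypothetical names, not the real schema.
function pagedQuery(table, page, pageSize) {
  var offset = page * pageSize;
  return "SELECT * FROM " + table +
         " ORDER BY sample_id" +
         " LIMIT " + pageSize + " OFFSET " + offset;
}
```

For example, `pagedQuery("readings", 2, 500)` asks only for rows 1000-1499, so each request stays small no matter how big the table is.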

Finally, the HTML.  Using that single large table is going to cause fits for some browsers, and some of the timeouts may be the result of low memory availability, with the client doing a lot of swapping.  All browsers have efficiency problems when rendering tables, and the most commonly used browser, IE, is the slowest.  Breaking the table up would help.  Setting the background locally for every row is also going to slow things down; using a CSS class might improve it slightly.
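To illustrate the CSS-class point: instead of a BGCOLOR attribute repeated on every row, the style can live once in a stylesheet and each row just carries a short class name.  A sketch of generating such a row (the class names "even"/"odd" are my own choice, with styles like `tr.odd { background-color: #eeeeee; }` defined once in the stylesheet):

```javascript
// Sketch: emit alternating-row markup with a short CSS class instead of
// repeating a BGCOLOR attribute on every <tr>.  Class names are assumptions.
function rowMarkup(cells, rowIndex) {
  var cls = (rowIndex % 2 === 0) ? "even" : "odd";
  var html = '<tr class="' + cls + '">';
  for (var i = 0; i < cells.length; i++) {
    html += "<td>" + cells[i] + "</td>";
  }
  return html + "</tr>";
}
```

Over thousands of rows, the saved attribute bytes add up, and the browser only has to resolve the style rule once.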

If it were my task to get it working, I would re-organize to reduce the granularity of such huge transfers.  I notice in the data that a very large number of rows have many fields in common, such as the pdf link.  If you group on the common factors and download a list of summaries, the user can then select a summary for delivery of the detail.
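A minimal sketch of that grouping step, assuming the shared field is the pdf link (the field name "pdf" is a guess at the real column): collapse the detail rows into one summary per group, send only the summaries first, and fetch a group's detail rows when the user picks one.

```javascript
// Sketch: collapse detail rows that share a common field into one
// summary per group.  The "pdf" key is a hypothetical field name.
function summarize(rows, key) {
  var groups = {};
  for (var i = 0; i < rows.length; i++) {
    var k = rows[i][key];
    if (!groups[k]) groups[k] = { key: k, count: 0 };
    groups[k].count++;
  }
  var out = [];
  for (var g in groups) out.push(groups[g]);
  return out;
}
```

If 10,000 detail rows share a few hundred pdf links, the first page only carries a few hundred summary rows instead of the whole table.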

In the end, unless you reduce the number of rows returned, you are not going to get it significantly faster; there is just too much data being transferred to make it practical on a web page.

LVL 15

Expert Comment

ID: 12008922
I couldn't see the data, but it might help if you just passed the raw data and then used JavaScript to output the <tr>s and <td>s on the client.
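The suggestion above could look something like this: ship the values as a compact JavaScript array and build the table markup on the client, instead of sending full <tr>/<td> tags for every row (the sample values are invented, since the real data wasn't visible):

```javascript
// Sketch: compact data array instead of full table markup.
// These sample rows are hypothetical placeholders.
var data = [
  ["Cs-137", "13335", "0.42"],
  ["Cs-137", "13336", "0.39"]
];

function buildTable(rows) {
  var html = "<table>";
  for (var r = 0; r < rows.length; r++) {
    html += "<tr>";
    for (var c = 0; c < rows[r].length; c++) {
      html += "<td>" + rows[r][c] + "</td>";
    }
    html += "</tr>";
  }
  return html + "</table>";
}

// In the page itself, the generated markup would be written out:
if (typeof document !== "undefined") {
  document.write(buildTable(data));
}
```

The tag characters are then generated once on the client rather than transferred over the wire for every cell.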
LVL 53

Expert Comment

ID: 12009783
If this was an intranet app, you could use a hidden frame and fetch the data in blocks with XMLHTTP GETs.  That would let you render the first block of rows right away and then dynamically expand the table as additional blocks came in, so the user could start working as soon as the first block was loaded.  I'm not sure it would be practical across the Internet: a user with a slow connection might end up crashing instead of timing out, and from the look of the table code, growing the table dynamically would present some coding challenges.
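A rough sketch of that block-loading idea, assuming a server-side page that returns one block of rows per request (the "getrows.cfm" URL and its parameters are made-up names, not anything from the actual site):

```javascript
// Sketch: fetch the table in blocks via XMLHTTP and append each block
// as it arrives.  "getrows.cfm" and its parameters are hypothetical.
var BLOCK_SIZE = 200;

function blockUrl(start) {
  return "getrows.cfm?start=" + start + "&count=" + BLOCK_SIZE;
}

function loadBlock(start, appendRows) {
  // IE 5/6 exposed XMLHTTP as an ActiveX object; later browsers
  // provide the native XMLHttpRequest.
  var xhr = (typeof XMLHttpRequest !== "undefined")
      ? new XMLHttpRequest()
      : new ActiveXObject("Microsoft.XMLHTTP");
  xhr.open("GET", blockUrl(start), true);
  xhr.onreadystatechange = function () {
    if (xhr.readyState === 4 && xhr.status === 200) {
      appendRows(xhr.responseText);        // add this block to the table
      if (xhr.responseText.length > 0) {   // non-empty reply: get next block
        loadBlock(start + BLOCK_SIZE, appendRows);
      }
    }
  };
  xhr.send(null);
}
```

The asynchronous GET means the page stays responsive between blocks, which is what lets the user start working before the full table has arrived.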



Author Comment

ID: 12045587
I wanted to thank you for your responses. I had something big come up and am working on it. I just wanted to let you know I had not forgotten about this and I will get back to it as soon as possible.
LVL 53

Expert Comment

ID: 12049553
No problem.  Thanks for letting us know there will be a time lag. :^)

LVL 53

Expert Comment

ID: 12165440
Thanks for the A. :^)




