Solved

VB.net Select/Copy very large tables

Posted on 2015-01-23
157 Views
Last Modified: 2015-01-24
Hi

I have a VB.net Windows Forms application in which I have to count the records in SQL Server tables of 65 to 100 million rows. I also have to pull data into a DataGridView.

Is there a class that I can use to make this faster? I have used the SqlBulkCopy class to upload data and am looking for something similar to pull data down faster.
Question by:Murray Brown
5 Comments
 
LVL 143

Expert Comment

by:Guy Hengel [angelIII / a3]
ID: 40566031
If you want to display 100M rows on the client, there is no fast way to do it.
The alternative is to "page" (and obviously filter) the data, so the end user reduces the number of records that have to be transferred from the server across the whole network infrastructure and loaded into the end user's RAM (see the sketch at the end of this comment). I often see developers forget that resources are limited, especially on normal end-user machines.
The fewer rows you bring to the user, the better (for EVERYbody).

You will need to clarify exactly what this form needs to do, and how to reduce the data transfer between the machines.
Just consider 100M rows of 1KB each (which is actually a fairly short row): the total size of all those records would be around 100GB. How much RAM does your machine have? How much disk space?

The next point is that anything above about 100 records at a time is unusable for a normal end user; they will just be scrolling "all day".

Note: there are exceptions, of course, but they are very rare.
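
For illustration, a minimal paging sketch in VB.NET, assuming SQL Server 2012 or later (for OFFSET/FETCH); the table, column, and connection-string names are placeholders, not taken from the question:

' A minimal paging sketch: pull only one page of rows into the grid at a time.
' Assumes SQL Server 2012+ (OFFSET/FETCH); "dbo.BigTable", "Id", "Col1", "Col2"
' and the connection string are placeholders.
Imports System.Data
Imports System.Data.SqlClient

Public Class PagedLoader
    Private Const PageSize As Integer = 100

    Public Function LoadPage(connectionString As String, pageIndex As Integer) As DataTable
        Dim sql As String = _
            "SELECT Id, Col1, Col2 " & _
            "FROM dbo.BigTable " & _
            "ORDER BY Id " & _
            "OFFSET @Offset ROWS FETCH NEXT @PageSize ROWS ONLY;"

        Dim page As New DataTable()
        Using conn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand(sql, conn)
                cmd.Parameters.Add("@Offset", SqlDbType.Int).Value = pageIndex * PageSize
                cmd.Parameters.Add("@PageSize", SqlDbType.Int).Value = PageSize
                Using adapter As New SqlDataAdapter(cmd)
                    adapter.Fill(page)   ' only one page of rows crosses the network
                End Using
            End Using
        End Using
        Return page
    End Function
End Class

' Usage (e.g. for the first page):
' DataGridView1.DataSource = New PagedLoader().LoadPage(connStr, 0)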
 

Author Comment

by:Murray Brown
ID: 40566045
Thanks Guy. What about counting records?
 
LVL 143

Accepted Solution

by:Guy Hengel [angelIII / a3] earned 500 total points
ID: 40566052
Do you just need the total count of the full table, with no filters?
There you have some options: either slow and accurate, fast and inaccurate (see the sketch below), or not real-time:
http://blogs.msdn.com/b/martijnh/archive/2010/07/15/sql-server-how-to-quickly-retrieve-accurate-row-count-for-table.aspx
If the count is filtered on some fields, the above methods won't work at all.
Depending on the needs (mainly on how real-time the information has to be), an hourly job on the DB that extracts the counts to a staging table could do the job; the user then queries that table, which gives "instant" results, together with a timestamp to tell them "this was the situation as of 12 minutes ago"...
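
As an illustration of the "fast but inaccurate" option from that link, here is a sketch that reads the row count from partition metadata instead of running SELECT COUNT(*); the table name is a placeholder, and the caller needs permission to read the DMV:

' Sketch only: approximate row count from sys.dm_db_partition_stats, which is
' near-instant but can lag slightly behind the true count. "dbo.BigTable" is a placeholder.
Imports System.Data.SqlClient

Public Module RowCounter
    Public Function ApproxRowCount(connectionString As String, tableName As String) As Long
        Dim sql As String = _
            "SELECT SUM(p.row_count) " & _
            "FROM sys.dm_db_partition_stats AS p " & _
            "WHERE p.object_id = OBJECT_ID(@TableName) " & _
            "AND p.index_id IN (0, 1);"   ' heap (0) or clustered index (1) only

        Using conn As New SqlConnection(connectionString)
            Using cmd As New SqlCommand(sql, conn)
                cmd.Parameters.AddWithValue("@TableName", tableName)
                conn.Open()
                Return Convert.ToInt64(cmd.ExecuteScalar())
            End Using
        End Using
    End Function
End Module

' Usage:
' Dim total As Long = ApproxRowCount(connStr, "dbo.BigTable")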
 

Author Closing Comment

by:Murray Brown
ID: 40567915
Thanks very much
 

Author Comment

by:Murray Brown
ID: 40567916
Great answer!
