• Status: Solved
  • Priority: Medium
  • Security: Public
  • Views: 200

Copy large amount of records via stored procedure?

Hello,

I have a view that returns close to 5 million rows of data. I need to copy this data into a table with the same structure.

Is it possible to create a stored procedure which copies the data over in chunks (so it doesn't timeout)?

If so can you give an example or a link to a tutorial?

Thanks,
-Torrwin
Asked by: Torrwin
1 Solution
 
QPRCommented:
Is this a one-off?
You might want to consider creating an SSIS package.
Or use BCP.
 
TorrwinAuthor Commented:
I'm not sure what a one off is, but I need to be able to execute it from a VB.NET app, so I don't think those are options.
 
QPRCommented:
A "one off" is an event you only plan to do once.

http://www.vbrad.com/article.aspx?id=10
It explains the speed benefits of bcp over stored procedures, and it includes a VB class/exe that you might be able to include in your project.

 
TorrwinAuthor Commented:
No, this will happen several times a month.

Also, BCP appears to need the source to be a file, not a view.
 
Ken SelviaRetiredCommented:

select * into newTable from MyView

will transfer all rows into newTable, but 5 million rows is relatively small as far as I'm concerned, and that statement would not "time out" in any case. It will run until it completes, which should be 3-15 seconds depending on the complexity of the view.

The answer is yes, you could make a stored procedure return chunks or ranges of data. But the bottleneck probably lies in the view itself: retrieving ranges from it is unlikely to improve performance, and having a stored procedure return portions of it is probably not the best approach.

To get the first 1000 rows:

select top 1000 * from mytable

To get the second 1000 rows (rows 1001-2000):

select top 1000 * from
( select top 2000 * from mytable order by ID desc )
order by ID ASC

where ID is the unique key in the result set

Make 5 versions of that statement, each retrieving 1 million rows.  I can pretty much guarantee that will perform far worse than what you have now though.
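If you do want the chunked approach inside a single stored procedure, another way is to loop in key ranges so each batch is a small, separately committed statement. This is only a rough sketch, assuming the view exposes a unique integer key named ID and a target table dbo.TargetTable already exists (all object names here are placeholders, not from the original question):

```sql
CREATE PROCEDURE dbo.CopyViewInChunks
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @LastID int;
    DECLARE @Rows int;
    SET @LastID = 0;
    SET @Rows = 1;

    WHILE @Rows > 0
    BEGIN
        -- Copy the next batch of rows above the last key we saw.
        INSERT INTO dbo.TargetTable
        SELECT TOP (100000) *
        FROM dbo.MyView
        WHERE ID > @LastID
        ORDER BY ID;

        SET @Rows = @@ROWCOUNT;

        IF @Rows > 0
            SELECT @LastID = MAX(ID) FROM dbo.TargetTable;
    END
END
```

Each INSERT commits on its own, so no single statement holds locks or log space for the whole 5 million rows; whether this beats one big INSERT ... SELECT depends on how expensive the view is to evaluate per batch.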

Can we see your view? Maybe we can suggest ways to improve the performance.

And yes you can use a view with bcp using the queryout option.
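For the queryout route, bcp takes an arbitrary SELECT (including one against a view) and writes the result to a file, which you can then load back in with the `in` direction. A sketch from a command prompt, assuming a trusted connection; server, database, and path names are placeholders:

```shell
bcp "SELECT * FROM MyDatabase.dbo.MyView" queryout C:\export\myview.dat -c -T -S MYSERVER
bcp MyDatabase.dbo.TargetTable in C:\export\myview.dat -c -T -S MYSERVER
```

Here -c uses character format, -T uses Windows authentication, and -S names the server instance.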

 
TorrwinAuthor Commented:
Basically, the BCP suggestion led me to discover the .NET SqlBulkCopy class, which is BCP-based.
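For anyone landing here later, calling SqlBulkCopy from VB.NET might look like the sketch below. Connection strings and object names are placeholders; this assumes the target table already exists with a compatible structure:

```vbnet
Imports System.Data.SqlClient

Module CopyView
    Sub Main()
        Dim connStr As String = "Data Source=MYSERVER;Initial Catalog=MyDatabase;Integrated Security=True"
        Using src As New SqlConnection(connStr)
            src.Open()
            Dim cmd As New SqlCommand("SELECT * FROM dbo.MyView", src)
            cmd.CommandTimeout = 0   ' let the source query run as long as it needs
            Using reader As SqlDataReader = cmd.ExecuteReader()
                Using bulk As New SqlBulkCopy(connStr)
                    bulk.DestinationTableName = "dbo.TargetTable"
                    bulk.BatchSize = 100000   ' commit in chunks rather than one transaction
                    bulk.BulkCopyTimeout = 0  ' no overall timeout on the copy
                    bulk.WriteToServer(reader)
                End Using
            End Using
        End Using
    End Sub
End Module
```

Setting BatchSize keeps each committed chunk small, which addresses the original timeout concern without hand-writing a chunking stored procedure.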
