Copy large amount of records via stored procedure?

Hello,

I have a view which returns close to 5 million rows of data.  I need to copy this data into a table of the same structure.

Is it possible to create a stored procedure which copies the data over in chunks (so it doesn't timeout)?

If so can you give an example or a link to a tutorial?

Thanks,
-Torrwin
Torrwin Asked:
QPR Commented:
Is this a one off?
You might want to consider creating a SSIS package.
Or use BCP
 
Torrwin (Author) Commented:
I'm not sure what a one off is, but I need to be able to execute it from a VB.NET app, so I don't think those are options.
 
QPR Commented:
A "one off" is an event you only plan to do once.

http://www.vbrad.com/article.aspx?id=10
It explains the benefits of bcp over stored procedures (speed) and includes a VB class/exe that you might be able to use in your project.
 
Torrwin (Author) Commented:
No, this will happen several times a month.

Also, BCP appears to need the source to be a file, not a view.
 
Ken Selvia Commented:

select * into newTable from MyView

will transfer all rows into newTable, but 5 million rows is relatively small as far as I'm concerned, and that statement would not "time out" in any case. It will run until it completes, which should be 3-15 seconds depending on the complexity of the view.
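Note that SELECT ... INTO always creates a brand-new table. Since the question says the destination table already exists with the same structure, INSERT ... SELECT is the equivalent one-statement copy (table and view names here are placeholders):

```sql
-- Hypothetical names; dbo.MyTable must already exist and match the view's columns.
INSERT INTO dbo.MyTable
SELECT * FROM dbo.MyView;
```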

The answer is yes, you could make a stored procedure return chunks or ranges of data. But the bottleneck probably lies in the view itself: retrieving ranges from it is unlikely to improve performance, so a stored procedure that returns portions of the data is probably not the best approach.

To get the first 1,000 rows:

select top 1000 * from mytable order by ID

To get the 2nd 1,000 rows:

select top 1000 * from
( select top 2000 * from mytable order by ID asc ) as t
order by ID desc

where ID is the unique key in the result set. (The outer query returns rows 1,001-2,000, in descending order; re-sort if the order matters.)

Make five versions of that statement, each retrieving 1 million rows. I can pretty much guarantee that will perform far worse than what you have now, though.
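If you do want a stored procedure that copies in chunks, a keyset-paged loop avoids the nested-TOP trick entirely. This is a sketch only, assuming the view exposes a unique integer ID column and the destination table already exists (all names are hypothetical):

```sql
-- Copy dbo.MyView into dbo.MyTable in batches of 100,000 rows,
-- walking forward on the unique ID column.
DECLARE @LastID int = 0;
DECLARE @Rows int = 1;

WHILE @Rows > 0
BEGIN
    INSERT INTO dbo.MyTable
    SELECT TOP (100000) *
    FROM dbo.MyView
    WHERE ID > @LastID
    ORDER BY ID;

    SET @Rows = @@ROWCOUNT;

    -- Advance the key past the rows just copied.
    IF @Rows > 0
        SELECT @LastID = MAX(ID) FROM dbo.MyTable;
END
```

Each batch is a separate statement, so no single statement runs long enough to hit a command timeout, and the transaction log is churned in smaller pieces.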

Can we see your view? Maybe we can suggest ways to improve the performance.

And yes you can use a view with bcp using the queryout option.
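For example, a bcp export/import pair along these lines (server, database, view, and table names are placeholders; -T uses Windows authentication and -c uses character format):

```
bcp "SELECT * FROM MyDb.dbo.MyView" queryout myview.dat -S MyServer -T -c

bcp MyDb.dbo.MyTable in myview.dat -S MyServer -T -c
```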

 
Torrwin (Author) Commented:
Basically, the BCP suggestion led me to discover the SqlBulkCopy class, which is based on BCP.
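For anyone landing here later: a minimal VB.NET sketch of that approach, streaming the view through SqlBulkCopy (connection string, view, and table names are placeholders):

```vbnet
' Sketch: bulk-copy dbo.MyView into dbo.MyTable using SqlBulkCopy.
Imports System.Data.SqlClient

Module BulkCopyExample
    Sub Main()
        Dim connStr As String =
            "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True"

        Using src As New SqlConnection(connStr), dest As New SqlConnection(connStr)
            src.Open()
            dest.Open()

            Dim cmd As New SqlCommand("SELECT * FROM dbo.MyView", src)
            Using reader As SqlDataReader = cmd.ExecuteReader()
                Using bulk As New SqlBulkCopy(dest)
                    bulk.DestinationTableName = "dbo.MyTable"
                    bulk.BatchSize = 100000   ' commit in chunks
                    bulk.BulkCopyTimeout = 0  ' disable the copy timeout
                    bulk.WriteToServer(reader)
                End Using
            End Using
        End Using
    End Sub
End Module
```

BatchSize controls how many rows go per transaction, and BulkCopyTimeout = 0 removes the timeout that motivated the original question.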
Question has a verified solution.