Best Method to Insert Millions of rows

All -

I'm creating a big table with 500M rows. The following script runs fine but is slow. How can I improve the performance?

Got any ideas?

Thanks,

~Sve
USE MyDB
GO

DECLARE @start_time DATETIME, @end_time DATETIME
SET @start_time = CURRENT_TIMESTAMP

SET NOCOUNT ON
-- Insert rows with random values
DECLARE @row INT;
DECLARE @clmn INT;
DECLARE @string VARCHAR(80), @fullname VARCHAR(20), @length INT, @code INT;
SET @row = 0;
SET @clmn = 0;

WHILE @row < 1000000
BEGIN
   SET @row = @row + 1;

   -- Build a random string of lowercase letters and spaces (up to 80 characters)
   SET @length = ROUND(80*RAND(), 0);
   SET @string = '';
   WHILE @length > 0
   BEGIN
      SET @length = @length - 1;
      SET @code = ROUND(32*RAND(), 0) - 6;
      IF @code BETWEEN 1 AND 26
         SET @string = @string + CHAR(ASCII('a') + @code - 1);
      ELSE
         SET @string = @string + ' ';
   END

   -- Insert one row into the UserInfo table
   INSERT INTO [MyDB].[USER].[USERINFO]
   VALUES
      ('A' + SUBSTRING(@string, 1, 19),
       '1' + LEFT(LTRIM(STR(RAND()*RAND()*10000000000, 10, 0) + REPLICATE('0', 10)), 10),
       NULL, NULL, NULL)

   -- Insert 500 Msg rows for this user
   WHILE @clmn < 500
   BEGIN
      SET @clmn = @clmn + 1;
      INSERT INTO [MyDB].[Msg].[Msg] VALUES (
         -- @row,
         @string,
         '1' + LEFT(LTRIM(STR(RAND()*RAND()*10000000000, 10, 0) + REPLICATE('0', 10)), 10),
         '1' + LEFT(LTRIM(STR(RAND()*RAND()*10000000000, 10, 0) + REPLICATE('0', 10)), 10),
         GETDATE() - 1,
         CONVERT(DATETIME, ROUND(60000*RAND() - 30000, 9)),
         0,
         'Delivered',
         @string,
         @string,
         NULL,
         NULL)
   END
   SET @clmn = 0
END

-- Report elapsed time
SET @end_time = CURRENT_TIMESTAMP
SELECT DATEDIFF(second, @start_time, @end_time) AS elapsed_seconds
GO


 
arnold Commented:
What is it you are testing?
You can move GETDATE() - 1 out and assign it once to a variable:
SET @date = GETDATE() - 1; for use in the nested WHILE loop.

The outer userinfo data is around 70 characters per row, roughly 70 MB with all 1M rows in.
The nested msg rows are around 300 characters each.

How frequently do you need to run this script?
Depending on what you are testing, you could leave the userinfo rows you generated in the past alone and only generate new msg rows,
i.e. run a cursor over userinfo to get the string column, then run the inner loop of 500 msg inserts per string (a cursor sketch follows below).

Another option is to break it down in terms of how many rows you create per run/per file, i.e. instead of 1M users with 500 msgs each in one pass,
run three loops:
outer loop: does nothing other than manage the file size (200 iterations)
middle loop: 50,000 userinfo rows
inner loop: 500 msg rows for each userinfo row

The files will be about 4 MB for userinfo + 150 KB of msg data per userinfo row.

Prior to the end of the outermost loop, you bulk import the two CSV files created by the two inner loops.
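
A rough cursor skeleton for the reuse-existing-userinfo approach; the column name is a placeholder, since the USERINFO insert above does not list its columns:

DECLARE @string VARCHAR(80);
DECLARE user_cur CURSOR FAST_FORWARD FOR
   SELECT username FROM [MyDB].[USER].[USERINFO];   -- column name is assumed
OPEN user_cur;
FETCH NEXT FROM user_cur INTO @string;
WHILE @@FETCH_STATUS = 0
BEGIN
   -- run the inner loop of 500 msg inserts for this @string here
   FETCH NEXT FROM user_cur INTO @string;
END
CLOSE user_cur;
DEALLOCATE user_cur;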




 
sameer2010 Commented:
I would suggest bcp or bulk import.
 
sventhan (Author) Commented:
Thanks, Sameer.

How can I convert the above code into a bcp?
 
sameer2010 Commented:
Write all of these to a file and then use BCP.
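
For example, once the generated rows are written out as a comma-delimited file, the load could look roughly like this (server name, file path, and batch size are placeholders):

bcp MyDB.Msg.Msg in C:\load\msg_data.csv -S YourServer -T -c -t, -b 100000

Here -c loads in character mode, -t, sets the comma field terminator, and -b commits in batches rather than as one huge transaction.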
 
arnold Commented:
If the data is in CSV format, you can import/load it as sameer2010 suggested.
http://blog.sqlauthority.com/2008/02/06/sql-server-import-csv-file-into-sql-server-using-bulk-insert-load-comma-delimited-file-into-sql-server/

bulk insert tablename from 'filename.csv' with (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')
Note that if you have commas inside the column data, you would need to make changes,
i.e. for rows like
"lastname, firstname",ser,sds,"this is a message, and this is the data."

http://stackoverflow.com/questions/782353/sql-server-bulk-insert-of-csv-file-with-inconsistent-quotes



 
sventhan (Author) Commented:
Thanks Arnold.

I'm talking about 500M rows of data. I'll look into the comments.

Is there anything else I can do?

Dropping the primary key/index before the load, etc.?

 
arnold Commented:
The alternative is to use a more robust option to generate the data, i.e. outside SQL.

Calling RAND() so many times per data point adds to the slowdown.
Generate the data outside SQL, then import the CSV.
You have two tables, one with 1 million rows and the other with 500 million rows.

To make sure there is no possibility of the length being 0, use an offset:
SET @length = ROUND(79*RAND()+1,0);
so @length will be between 1 and 80.

Does the 10-digit number have to be a string? The LEFT, LTRIM, and REPLICATE calls add to the overhead.
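
A sketch of the numeric alternative, assuming the column can be a BIGINT (the variable name is illustrative):

DECLARE @num BIGINT;
SET @num = 1000000000 + CAST(RAND() * 9000000000 AS BIGINT);   -- a 10-digit value

One RAND() call and a cast would replace the STR/LTRIM/REPLICATE/LEFT chain the current script runs for every value.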
 
sventhan (Author) Commented:
Good point, Arnold. I already noticed that and moved all the static values outside of the loop.
The MSG table is partitioned daily and has 50 partitions covering the last couple of months. It has a primary key on the msgid column. Can I drop them before the load? I can create them again once the load is completed.

If I take this to CSV I need to worry about disk space. If that's the better way, I'll take that route.

Thanks again.

~sve.
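
For reference, the drop/recreate around the load would look roughly like this; the constraint name PK_Msg is a placeholder:

-- drop the primary key before the bulk load
ALTER TABLE [MyDB].[Msg].[Msg] DROP CONSTRAINT PK_Msg;
-- ...run the load...
-- recreate the primary key once the load is completed
ALTER TABLE [MyDB].[Msg].[Msg] ADD CONSTRAINT PK_Msg PRIMARY KEY (msgid);

The idea is that building the key once after the load avoids maintaining the index on every one of the 500M inserts.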
 
sventhan (Author) Commented:
Sounds great.

I did almost everything you've described except creating the CSV file. We're running a messaging system and would like to check the performance as the table grows bigger. The testers want to run their tests against this big table, and I have to create an aged DB for them.

Hey, thanks for your kind help and I'll finish this up tomorrow.

~sve.
 
arnold Commented:
You may need to redesign/rearchitect the database setup, since it does not seem right to let the messaging table grow this big.
You might want to partition the messaging table by the age of the messages (see the sketch below).
This way you keep queries limited to the most recent records.

The other problem with randomly generated, non-contextual data points is that you cannot improve performance by adding indexes, etc. Presumably the only indexes you have are on the sender/recipient.
Not sure whether the two numeric strings are references to something else; similarly, not sure what the two date fields are supposed to represent.
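
A rough sketch of date-based partitioning; the function/scheme names, the boundary dates, and the sent_date column are illustrative (in practice you would generate one boundary per day):

CREATE PARTITION FUNCTION pfMsgByDay (DATETIME)
AS RANGE RIGHT FOR VALUES ('2012-01-01', '2012-01-02', '2012-01-03');

CREATE PARTITION SCHEME psMsgByDay
AS PARTITION pfMsgByDay ALL TO ([PRIMARY]);

-- the Msg table (or its clustered index) is then created ON psMsgByDay(sent_date)

Old partitions can then be switched out or left alone while queries only touch the recent ones.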
 
sventhan (Author) Commented:
Thanks Arnold.

It works like a champ.