Solved

Fast INSERT new rows in SQL Table

Posted on 2014-10-23
Medium Priority
168 Views
Last Modified: 2014-10-28
Hi Experts,

I want to insert a certain number of rows, containing only one field with consecutive integers, into a SQL Server table. The only way I know to do this is with a loop. It runs fast, but I would like to make it faster.
How can I do this?

I use the following code to add 123456 rows to the table Test. On other occasions I use this table to insert a different number of rows.

TRUNCATE TABLE Test;
GO
declare @i int
declare @rows_to_insert int
set @i = 1
set @rows_to_insert = 123456

while @i < @rows_to_insert
    begin
    INSERT INTO Test VALUES (@i)
    set @i = @i + 1
    end


On the webpage http://weblogs.sqlteam.com/jamesn/archive/2008/05/29/60612.aspx I found code that creates a list of numbers at lightning speed. But the numbers are only kept in memory and not saved to a table. Because this approach is beyond my T-SQL knowledge, I didn't even try to alter the code to save the numbers.
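For context, the loop-free, set-based pattern referenced above can be sketched roughly as follows (illustrative only; it assumes SQL Server 2008+ and that Test has a single int column, as in the question):

```sql
-- Sketch: persist 123456 consecutive integers without a loop,
-- using stacked CTEs and ROW_NUMBER() (assumes SQL Server 2008+).
TRUNCATE TABLE Test;

;WITH E1(N) AS (SELECT 1 FROM (VALUES (1),(1),(1),(1),(1),(1),(1),(1),(1),(1)) v(N)), -- 10 rows
      E2(N) AS (SELECT 1 FROM E1 a CROSS JOIN E1 b),  -- 100 rows
      E4(N) AS (SELECT 1 FROM E2 a CROSS JOIN E2 b),  -- 10,000 rows
      E6(N) AS (SELECT 1 FROM E4 a CROSS JOIN E2 b)   -- 1,000,000 rows
INSERT INTO Test
SELECT TOP (123456) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM E6;
```

The INSERT...SELECT is what saves the generated numbers to the table rather than leaving them in memory.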
Question by:GeertWillemarck
9 Comments
 
LVL 11

Assisted Solution

by:LordWabbit
LordWabbit earned 668 total points
ID: 40399482
use a transaction

BEGIN TRANSACTION
while @i < @rows_to_insert
    begin
    INSERT INTO Test VALUES (@i)
    set @i = @i + 1
    end
COMMIT


0
 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399496
Oh yeah, and it inserts 123455 not 123456 rows.  Might want to change
while @i < @rows_to_insert


to
while @i <= @rows_to_insert


0
 
LVL 30

Expert Comment

by:Olaf Doschke
ID: 40399516
Since SQL Server has a transaction log, repeatedly creating such a numbers or tally table fills your log fast, so it's advisable to use a table variable when you need such a table. Filling that table inside a transaction helps too, as LordWabbit suggests.

You may generate a large numbers table with a number field once for repeated use, and then SELECT TOP N number FROM numbers ORDER BY number.

I haven't tested it, but it's likely that generating a table variable on the fly is even faster than this TOP N select from a real table.
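Olaf's permanent-table suggestion might look something like this (a sketch; the Numbers table and its number column are illustrative names, and sys.all_objects is just a convenient row source for the one-time fill):

```sql
-- One-time setup: fill a permanent numbers table for repeated reuse.
CREATE TABLE Numbers (number int NOT NULL PRIMARY KEY);

INSERT INTO Numbers (number)
SELECT TOP (1000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

-- Later, whenever N consecutive integers are needed:
SELECT TOP (123456) number FROM Numbers ORDER BY number;
```

This trades a one-time log hit for cheap reads afterwards, subject to the page-cache caveat discussed below.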

Bye, Olaf.
0
 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399596
Mmmm, decided to mess around with the query.
Without the transaction the inserts take ~20020 milliseconds.
With the transaction it comes way down to ~1066 milliseconds.
Memory table without transaction is ~1583 milliseconds.
Memory table with transaction is ~1320 milliseconds.

Which I found surprising, would have thought the memory table would be faster???

With SET NOCOUNT ON the times come down even more, to 19223, 853, 1306, and 1140 milliseconds respectively.
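The exact benchmark code isn't shown in the thread, but a harness of this kind would produce numbers like those above (a sketch, assuming SQL Server 2008+; @start and elapsed_ms are illustrative names):

```sql
-- Sketch of a timing harness for the looped insert.
SET NOCOUNT ON;  -- suppress the per-row "1 row affected" messages

DECLARE @start datetime2 = SYSDATETIME();
DECLARE @i int = 1, @rows_to_insert int = 123456;

BEGIN TRANSACTION;  -- one commit instead of 123456 autocommits
WHILE @i <= @rows_to_insert
BEGIN
    INSERT INTO Test VALUES (@i);
    SET @i = @i + 1;
END
COMMIT;

SELECT DATEDIFF(millisecond, @start, SYSDATETIME()) AS elapsed_ms;
```

The big win from the transaction comes from flushing the log once at COMMIT rather than once per statement.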
0
 
LVL 27

Expert Comment

by:Zberteoc
ID: 40399605
Create this function and use it as a table. It is the fastest way:
CREATE FUNCTION [dbo].[fnTally]()
RETURNS TABLE --WITH SCHEMABINDING 
AS
/*******************************************************************************\
Function	: fnTally

Purpose		: returns a set with numbers from 1 to 10,000 
			  to be used in parsing and sequential data generation without a loop
			  
Parameters	: no parameters

Invoke		:
	
		select * from [zb_dba_maint].[dbo].[fnTally]()
		select N from [zb_dba_maint].[dbo].[fnTally]()
		select substring('abcdef',N,1) as chr from [zb_dba_maint].[dbo].[fnTally]() where N<len('abcdef') -- parsing a string
		select dateadd(dd, N, '2007-01-01') as dte from [zb_dba_maint].[dbo].[fnTally]() --gets dates for about 30 years

\*******************************************************************************/
RETURN
	WITH 
	E1(N) AS 
	( --10E+1 or 10 rows
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
	),                         
   E2(N) AS 
   ( --10E+2 or 100 rows	
		SELECT 1 FROM E1 a, E1 b
	),
   E4(N) AS 
   ( --10E+4 or 10,000 rows max
		SELECT 1 FROM E2 a, E2 b
	)
			 SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as N FROM E4
	;

GO


0
 
LVL 30

Assisted Solution

by:Olaf Doschke
Olaf Doschke earned 664 total points
ID: 40399651
Good effort, LordWabbit, but if you used the simple while loop to fill the table variable, I guess that's what makes it slow; approaches using CTEs, like Zberteoc's, should be fastest.

I'd also like to see how fast a mere SELECT TOP N from such a permanent table is, if you fill it just once with a large enough amount of numbers to cover all cases and then only select from it, without recreating it.

Since that has to be read into pages, even if you add a NOLOCK, I guess it's only faster in cases where the server doesn't purge the pages from its cache.

Bye, Olaf.
0
 
LVL 27

Accepted Solution

by:
Zberteoc earned 668 total points
ID: 40399685
Actually I tweaked the function code a little bit to return 1,000,000 rows:
CREATE FUNCTION [dbo].[fnTally]()
RETURNS TABLE --WITH SCHEMABINDING 
AS
/*******************************************************************************\
Function	: fnTally

Purpose		: returns a set with numbers from 1 to 1,000,000 
			  to be used in parsing and sequential data generation without a loop
			  
Parameters	: no parameters

Invoke		:
	
		select * from [dbo].[fnTally]() where N<=123456 
		select N from [dbo].[fnTally]()
		select substring('abcdef',N,1) as chr from [dbo].[fnTally]() where N<len('abcdef') -- parsing a string
		select dateadd(dd, N, '2007-01-01') as dte from [dbo].[fnTally]() --gets dates for about 30 years

\*******************************************************************************/
RETURN
	WITH 
	E1(N) AS 
	( --10E+1 or 10 rows
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
	),                         
   E2(N) AS 
   ( --100 rows	
		SELECT 1 FROM E1 a, E1 b
	),
   E4(N) AS 
   ( --10,000 rows max
		SELECT 1 FROM E2 a, E2 b
	),
   E6(N) AS 
   ( --1,000,000 rows max
		SELECT 1 FROM E4 a, E2 b
	)
			 SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as N FROM E6
	;

GO


And then you can filter it by using:

select * from [dbo].[fnTally]() where N<=123456
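Since the asker's goal is to persist the rows, the function's output can feed an INSERT...SELECT directly (a short sketch, reusing the Test table from the question):

```sql
-- Persist 123456 consecutive integers using the tally function.
TRUNCATE TABLE Test;

INSERT INTO Test
SELECT N FROM [dbo].[fnTally]() WHERE N <= 123456;
```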
0
 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399699
Definitely the fastest - 520 milliseconds
0
 

Author Closing Comment

by:GeertWillemarck
ID: 40408380
Thank you guys for this really interesting discussion. I learned a lot from it, and it led to very fast table-generating SQL.
0
