Solved

Fast INSERT of new rows into a SQL Server table

Posted on 2014-10-23
153 Views
Last Modified: 2014-10-28
Hi Experts,

I want to insert a certain number of rows, each containing only one field with consecutive integers, into a SQL Server table. The only way I know to do this is with a loop. It runs fast, but I would like to make it faster.
How can I do this?

I use the following instructions to add 123456 rows to the table Test. On other occasions I use this table to insert a different number of rows.

TRUNCATE TABLE Test;
GO
declare @i int
declare @rows_to_insert int
set @i = 1
set @rows_to_insert = 123456

while @i < @rows_to_insert
    begin
    INSERT INTO Test VALUES (@i)
    set @i = @i + 1
    end


On the webpage http://weblogs.sqlteam.com/jamesn/archive/2008/05/29/60612.aspx I found code that creates a list of numbers at lightning speed. But the numbers are only kept in memory, not saved to a table. Because this approach is beyond my T-SQL knowledge, I didn't even try to alter the code to save the numbers.
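For completeness, a single-statement sketch of what I'm after might look like this (untested; assumes Test has a single integer column, as in the loop above):

```sql
-- Sketch: generate the numbers with a recursive CTE and save them in one statement.
DECLARE @rows_to_insert int = 123456;

WITH Numbers AS
(
    SELECT 1 AS n
    UNION ALL
    SELECT n + 1 FROM Numbers WHERE n < @rows_to_insert
)
INSERT INTO Test
SELECT n FROM Numbers
OPTION (MAXRECURSION 0); -- lift the default limit of 100 recursion levels
```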
Question by:GeertWillemarck
9 Comments
 
LVL 11

Assisted Solution

by:LordWabbit
LordWabbit earned 167 total points
ID: 40399482
Use a transaction:

BEGIN TRANSACTION
while @i < @rows_to_insert
    begin
    INSERT INTO Test VALUES (@i)
    set @i = @i + 1
    end
COMMIT


 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399496
Oh yeah, and it inserts 123455 rows, not 123456.  You might want to change
while @i < @rows_to_insert


to
while @i <= @rows_to_insert

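Putting both changes together, the full batch would look roughly like this (sketch; same table and variables as in the question):

```sql
-- Corrected loop bound plus the transaction from the earlier comment.
DECLARE @i int = 1;
DECLARE @rows_to_insert int = 123456;

BEGIN TRANSACTION;
WHILE @i <= @rows_to_insert  -- <= so all 123456 rows are inserted
BEGIN
    INSERT INTO Test VALUES (@i);
    SET @i = @i + 1;
END
COMMIT;
```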

 
LVL 29

Expert Comment

by:Olaf Doschke
ID: 40399516
Since SQL Server has a transaction log, repeatedly creating such a numbers (tally) table fills your log fast, so it's advisable to use a table variable when you need such a table. Filling that table inside a transaction still helps, as LordWabbit suggests.

You could generate a large numbers table with a number field once, for repeated use, and then SELECT TOP N number FROM numbers ORDER BY number.

I haven't tested it, but it's likely that generating a table variable on the fly is even faster than this TOP N select from a real table.

Bye, Olaf.
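A sketch of that idea (table and column names are assumed, not from the question): fill a permanent numbers table once, then reuse it for any row count up to its size.

```sql
-- One-time setup of a permanent numbers table (names assumed).
CREATE TABLE numbers (number int NOT NULL PRIMARY KEY);

INSERT INTO numbers (number)
SELECT TOP (1000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM sys.all_objects a CROSS JOIN sys.all_objects b;

-- Later, whenever N consecutive integers are needed:
INSERT INTO Test
SELECT TOP (123456) number FROM numbers ORDER BY number;
```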
 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399596
Mmmm, decided to mess around with the query.
Without the transaction the inserts take ~20020 milliseconds.
With the transaction it comes way down to ~1066 milliseconds.
A memory (table variable) table without a transaction is ~1583 milliseconds.
A memory table with a transaction is ~1320 milliseconds.

Which I found surprising; I would have thought the memory table would be faster???

With SET NOCOUNT ON it comes down even more, to
19223, 853, 1306 and 1140 milliseconds respectively.
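For reproducibility, timings like these can be taken with a simple elapsed-time wrapper around each variant (sketch):

```sql
-- Measure one insert variant; repeat for each approach being compared.
SET NOCOUNT ON;
DECLARE @t datetime2 = SYSDATETIME();

-- ... run the insert variant being measured here ...

SELECT DATEDIFF(millisecond, @t, SYSDATETIME()) AS elapsed_ms;
```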
 
LVL 27

Expert Comment

by:Zberteoc
ID: 40399605
Create this function and use it as a table. It is the fastest way:
CREATE FUNCTION [dbo].[fnTally]()
RETURNS TABLE --WITH SCHEMABINDING 
AS
/*******************************************************************************\
Function	: fnTally

Purpose		: returns a set with numbers from 1 to 10,000 
			  to be used in parsing and sequential data generation without a loop
			  
Parameters	: no parameters

Invoke		:
	
		select * from [zb_dba_maint].[dbo].[fnTally]()
		select N from [zb_dba_maint].[dbo].[fnTally]()
		select substring('abcdef',N,1) as chr from [zb_dba_maint].[dbo].[fnTally]() where N<=len('abcdef') -- parsing a string
		select dateadd(dd, N, '2007-01-01') as dte from [zb_dba_maint].[dbo].[fnTally]() --gets dates for about 30 years

\*******************************************************************************/
RETURN
	WITH 
	E1(N) AS 
	( --10E+1 or 10 rows
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
	),                         
   E2(N) AS 
   ( --10E+2 or 100 rows	
		SELECT 1 FROM E1 a, E1 b
	),
   E4(N) AS 
   ( --10E+4 or 10,000 rows max
		SELECT 1 FROM E2 a, E2 b
	)
			 SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as N FROM E4
	;

GO


 
LVL 29

Assisted Solution

by:Olaf Doschke
Olaf Doschke earned 166 total points
ID: 40399651
Good effort, LordWabbit, but if you used the simple while loop to fill the table variable, I guess that's what makes it slow; approaches using CTEs, like the one Zberteoc uses, should be fastest.

I'd also like to see how fast a mere SELECT TOP N from such a permanent table is, if you fill it once with a large enough amount of numbers to cover all cases and then only select from it, without recreating it.

Since that data has to be read into pages, even if you add a NOLOCK hint, I guess it's only faster in cases where the server doesn't purge the pages from its cache.

Bye, Olaf.
 
LVL 27

Accepted Solution

by:Zberteoc
Zberteoc earned 167 total points
ID: 40399685
Actually, I tweaked the function code a little to return 1,000,000 rows:
CREATE FUNCTION [dbo].[fnTally]()
RETURNS TABLE --WITH SCHEMABINDING 
AS
/*******************************************************************************\
Function	: fnTally

Purpose		: returns a set with numbers from 1 to 1,000,000 
			  to be used in parsing and sequential data generation without a loop
			  
Parameters	: no parameters

Invoke		:
	
		select * from [dbo].[fnTally]() where N<=123456 
		select N from [dbo].[fnTally]()
		select substring('abcdef',N,1) as chr from [dbo].[fnTally]() where N<=len('abcdef') -- parsing a string
		select dateadd(dd, N, '2007-01-01') as dte from [dbo].[fnTally]() --gets dates for about 30 years

\*******************************************************************************/
RETURN
	WITH 
	E1(N) AS 
	( --10E+1 or 10 rows
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
	),                         
   E2(N) AS 
   ( --100 rows	
		SELECT 1 FROM E1 a, E1 b
	),
   E4(N) AS 
   ( --10,000 rows max
		SELECT 1 FROM E2 a, E2 b
	),
   E6(N) AS 
   ( --1,000,000 rows max
		SELECT 1 FROM E4 a, E2 b
	)
			 SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as N FROM E6
	;

GO


And then you can filter it by using:

select * from [dbo].[fnTally]() where N<=123456
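Applied to the original task, the whole thing becomes a single set-based statement (assumes Test has one integer column, as in the question):

```sql
TRUNCATE TABLE Test;

-- Set-based insert of 123456 consecutive integers via the tally function.
INSERT INTO Test
SELECT N FROM [dbo].[fnTally]() WHERE N <= 123456;
```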
 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399699
Definitely the fastest - 520 milliseconds
 

Author Closing Comment

by:GeertWillemarck
ID: 40408380
Thank you guys for this really interesting discussion. I learned a lot from it, and it led to a very fast table-generating SQL script.
