Solved

Fast INSERT new rows in SQL Table

Posted on 2014-10-23
147 Views
Last Modified: 2014-10-28
Hi Experts,

I want to insert a certain number of rows, containing only one field with consecutive integers, into a SQL Server table. The only way I know to do this is by using a loop. That runs fast, but I would like to make it faster.
How can I do this?

I use the following code to add 123456 rows to the table Test. On other occasions I insert a different number of rows into this table.

TRUNCATE TABLE Test;
GO
declare @i int
declare @rows_to_insert int
set @i = 1
set @rows_to_insert = 123456

while @i < @rows_to_insert
    begin
    INSERT INTO Test VALUES (@i)
    set @i = @i + 1
    end


On the webpage http://weblogs.sqlteam.com/jamesn/archive/2008/05/29/60612.aspx I found code that creates a list of numbers at lightning speed. But the numbers are only kept in memory and not saved in a table. Because this approach is beyond my T-SQL knowledge, I didn't even try to alter the code to save the numbers.
Question by:GeertWillemarck
9 Comments
 
LVL 11

Assisted Solution

by:LordWabbit
LordWabbit earned 167 total points
ID: 40399482
use a transaction

BEGIN transaction
while @i < @rows_to_insert
    begin
    INSERT INTO Test VALUES (@i)
    set @i = @i + 1
    end
commit


 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399496
Oh yeah, and it inserts 123455 rows, not 123456. You might want to change
while @i < @rows_to_insert


to
while @i <= @rows_to_insert


 
LVL 29

Expert Comment

by:Olaf Doschke
ID: 40399516
Since SQL Server has a transaction log, repeatedly creating such a number (tally) table fills your log quickly, so it's advisable to use a table variable when you need such a table. Filling that table inside a transaction still helps, as LordWabbit suggests.

You may generate a large numbers table with a number field once for repeated use and then SELECT TOP N number FROM numbers ORDER BY number.

I haven't tested it, but it's even likely that generating a table variable on the fly is faster than this TOP N select from a real table.

Bye, Olaf.
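
As a rough sketch of that idea (a permanent numbers table filled once and then reused with SELECT TOP N), assuming a table named dbo.Numbers and the single-column Test table from the question:

-- One-time setup: fill a permanent numbers table for repeated use.
CREATE TABLE dbo.Numbers (Number int NOT NULL PRIMARY KEY);

INSERT INTO dbo.Numbers (Number)
SELECT TOP (1000000) ROW_NUMBER() OVER (ORDER BY (SELECT NULL))
FROM sys.all_columns a CROSS JOIN sys.all_columns b;

-- Later, whenever a fresh sequence is needed:
TRUNCATE TABLE Test;
INSERT INTO Test
SELECT TOP (123456) Number FROM dbo.Numbers ORDER BY Number;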
NEW Veeam Agent for Microsoft Windows

Backup and recover physical and cloud-based servers and workstations, as well as endpoint devices that belong to remote users. Avoid downtime and data loss quickly and easily for Windows-based physical or public cloud-based workloads!

 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399596
Mmmm, decided to mess around with the query.
Without the transaction the inserts take ~20020 milliseconds.
With the transaction it comes way down to ~1066 milliseconds.
Memory table without a transaction is ~1583 milliseconds.
Memory table with a transaction is ~1320 milliseconds.

Which I found surprising, I would have thought the memory table would be faster.

With SET NOCOUNT ON it comes down even more, to 19223, 853, 1306 and 1140 milliseconds respectively.
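
A sketch of the kind of variation being timed above, combining SET NOCOUNT ON, a transaction and a "memory table"; the table variable and the final copy into Test are assumptions, not necessarily LordWabbit's exact test code:

SET NOCOUNT ON;  -- suppress the per-row "rows affected" messages, which slow down looped inserts

DECLARE @t TABLE (i int);                          -- the "memory table" (table variable)
DECLARE @i int = 1, @rows_to_insert int = 123456;

BEGIN TRANSACTION;
WHILE @i <= @rows_to_insert
BEGIN
    INSERT INTO @t VALUES (@i);
    SET @i = @i + 1;
END
INSERT INTO Test SELECT i FROM @t;                 -- copy into the real table in one statement
COMMIT;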
 
LVL 26

Expert Comment

by:Zberteoc
ID: 40399605
Create this function and use it as a table. It is the fastest way:
CREATE FUNCTION [dbo].[fnTally]()
RETURNS TABLE --WITH SCHEMABINDING 
AS
/*******************************************************************************\
Function	: fnTally

Purpose		: returns a set with numbers from 1 to 10,000 
			  to be used in parsing and sequential data generation without a loop
			  
Parameters	: no parameters

Invoke		:
	
		select * from [zb_dba_maint].[dbo].[fnTally]()
		select N from [zb_dba_maint].[dbo].[fnTally]()
		select substring('abcdef',N,1) as chr from [zb_dba_maint].[dbo].[fnTally]() where N<len('abcdef') -- parsing a string
		select dateadd(dd, N, '2007-01-01') as dte from [zb_dba_maint].[dbo].[fnTally]() --gets dates for about 30 years

\*******************************************************************************/
RETURN
	WITH 
	E1(N) AS 
	( --10E+1 or 10 rows
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
	),                         
   E2(N) AS 
   ( --10E+2 or 100 rows	
		SELECT 1 FROM E1 a, E1 b
	),
   E4(N) AS 
   ( --10E+4 or 10,000 rows max
		SELECT 1 FROM E2 a, E2 b
	)
			 SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as N FROM E4
	;

GO


 
LVL 29

Assisted Solution

by:Olaf Doschke
Olaf Doschke earned 166 total points
ID: 40399651
Good effort, LordWabbit, but if you used the simple while loop to fill the table variable, I guess that's what makes it slow; approaches using CTEs, like Zberteoc's, should be fastest.

I'd also like to see how fast a mere SELECT TOP N from such a permanent table is, if you fill it once with a large enough amount of numbers to cover all cases and then only select from it when you need such data, without recreating it.

Since that has to be read into pages, even if you add a NOLOCK hint, I guess it's only faster in cases where the server hasn't purged the pages from its cache.

Bye, Olaf.
 
LVL 26

Accepted Solution

by:Zberteoc
Zberteoc earned 167 total points
ID: 40399685
Actually I tweaked the function code a little bit to return 1,000,000 rows:
CREATE FUNCTION [dbo].[fnTally]()
RETURNS TABLE --WITH SCHEMABINDING 
AS
/*******************************************************************************\
Function	: fnTally

Purpose		: returns a set with numbers from 1 to 1,000,000 
			  to be used in parsing and sequential data generation without a loop
			  
Parameters	: no parameters

Invoke		:
	
		select * from [dbo].[fnTally]() where N<=123456 
		select N from [dbo].[fnTally]()
		select substring('abcdef',N,1) as chr from [dbo].[fnTally]() where N<len('abcdef') -- parsing a string
		select dateadd(dd, N, '2007-01-01') as dte from [dbo].[fnTally]() --gets dates for about 30 years

\*******************************************************************************/
RETURN
	WITH 
	E1(N) AS 
	( --10E+1 or 10 rows
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL
		 SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1 UNION ALL SELECT 1
	),                         
   E2(N) AS 
   ( --100 rows	
		SELECT 1 FROM E1 a, E1 b
	),
   E4(N) AS 
   ( --10,000 rows max
		SELECT 1 FROM E2 a, E2 b
	),
   E6(N) AS 
   ( --1,000,000 rows max
		SELECT 1 FROM E4 a, E2 b
	)
			 SELECT ROW_NUMBER() OVER (ORDER BY (SELECT NULL)) as N FROM E6
	;

GO


And then you can filter it by using:

select * from [dbo].[fnTally]() where N<=123456
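
For the original question, a sketch of how this could fill the Test table (assuming Test has a single integer column, as in the question's loop):

TRUNCATE TABLE Test;

INSERT INTO Test
SELECT N FROM [dbo].[fnTally]() WHERE N <= 123456;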
 
LVL 11

Expert Comment

by:LordWabbit
ID: 40399699
Definitely the fastest - 520 milliseconds
 

Author Closing Comment

by:GeertWillemarck
ID: 40408380
Thank you guys for this really interesting discussion. I learned a lot from it, and it led to very fast table-generating SQL.