Solved

Stored Procedure twice as slow as Transact-SQL

Posted on 2004-08-16
480 Views
Last Modified: 2008-02-01
Hi,

I've been working on some code in a VB.NET program to insert a lot (85,000) rows into an MS SQL 2000 table.

I originally started with Transact-SQL, then thought about using stored procedures to make my VB code look better, and to make it quicker.

But I've found that the same INSERT statement runs twice as slowly as a stored procedure.

My original SQL:
---------------
INSERT INTO cxacyr_mike
(Col001, Col002, Col003, Col004, Col005, Col006, Col007, Col008, Col009, Col010, Col011, Col012, Col013, Col014, Col015, Col016, Col017, Col018, Col019, Col020, Col021, Col022, Col023, Col024, Col025, Col026, Col027, Col028)
VALUES
('01','90000014','2004','no','804.03','0','0','0','0','0','0','324.03','0','0','0','0','0','480','0','0','DEANA','70479','28/02/04','CXYRENDA.P','','0','?','')
---------------

The stored procedure:
---------------
CREATE proc cxacyr_insert
@Col001 char(255),
@Col002 char(255),
@Col003 char(255),
@Col004 char(255),
@Col005 char(255),
@Col006 char(255),
@Col007 char(255),
@Col008 char(255),
@Col009 char(255),
@Col010 char(255),
@Col011 char(255),
@Col012 char(255),
@Col013 char(255),
@Col014 char(255),
@Col015 char(255),
@Col016 char(255),
@Col017 char(255),
@Col018 char(255),
@Col019 char(255),
@Col020 char(255),
@Col021 char(255),
@Col022 char(255),
@Col023 char(255),
@Col024 char(255),
@Col025 char(255),
@Col026 char(255),
@Col027 char(255),
@Col028 char(255)
AS

INSERT INTO cxacyr_mike
(Col001, Col002, Col003, Col004, Col005, Col006, Col007, Col008, Col009, Col010, Col011,
Col012, Col013, Col014, Col015, Col016, Col017, Col018, Col019, Col020, Col021, Col022,
Col023, Col024, Col025, Col026, Col027, Col028)
VALUES (@Col001, @Col002, @Col003, @Col004, @Col005, @Col006, @Col007, @Col008, @Col009, @Col010, @Col011,
@Col012, @Col013, @Col014, @Col015, @Col016, @Col017, @Col018, @Col019, @Col020, @Col021, @Col022,
@Col023, @Col024, @Col025, @Col026, @Col027, @Col028)
GO
---------------

Inserting 834 rows takes 6 seconds with T-SQL, but it takes 14 seconds with the SP.

Unfortunately I cannot change the column names or table structure because it's part of an ancient rubbish system.

I've found these on EE, but they deal with slightly more than an insert:
http://www.experts-exchange.com/Q_20894328.html
http://www.experts-exchange.com/Q_20729691.html
I tried the ideas mentioned, but nothing helped.

When I look at the execution plan in Query Analyzer, it shows just two steps:
"Table Insert/Insert" Cost:100%
"Insert" Cost:0%

Adding "with recompile" to the SP slowed it down even more.

Any ideas what I could do? At the moment I'm leaving it with the T-SQL!

Cheers
Mike
Question by:bealem
19 Comments
 
LVL 69

Accepted Solution

by: ScottPletcher (earned 50 total points)
ID: 11810566
That seems pretty logical actually.  There has to be some overhead in creating and populating all those variables.  For that reason, for inserts, non-SP T-SQL probably will always be faster and more efficient, and the more columns/length there are, the more noticeable the difference in performance will be.
 
LVL 17

Assisted Solution

by: BillAn1 (earned 50 total points)
ID: 11810662
You might want to try reducing the parameter sizes in your procedure, particularly as they are much larger than the actual data.
Try making them varchars instead. Even though the table columns might be char(255), you can treat the parameters as varchar and let the database automatically pad the values out when the insert is done. It will probably be quicker, considering the size and number of parameters you have.
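
A minimal sketch of that change (only the first three parameters shown; the rest follow the same pattern):
---------------
CREATE PROC cxacyr_insert
@Col001 varchar(255),
@Col002 varchar(255),
@Col003 varchar(255)
-- ..., @Col028 varchar(255)
AS

INSERT INTO cxacyr_mike (Col001, Col002, Col003 /* ..., Col028 */)
VALUES (@Col001, @Col002, @Col003 /* ..., @Col028 */)
GO
---------------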
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 11811430
And you could of course insert all 834 rows at once.  That should be faster.
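
SQL 2000 doesn't have the multi-row VALUES syntax, so one way to do this in a single statement is INSERT ... SELECT with UNION ALL. A minimal sketch (values invented for illustration, only two columns shown):
---------------
INSERT INTO cxacyr_mike (Col001, Col002 /* ..., Col028 */)
SELECT '01', '90000014'
UNION ALL
SELECT '01', '90000015'
UNION ALL
SELECT '01', '90000016'
---------------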
 
LVL 69

Expert Comment

by:ScottPletcher
ID: 11811499
>> And you could of course insert all 834 rows at once.  That should be faster. <<

Very much so, but that's going to be very difficult via an SP.
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 11813110
>>Very much so, but that's going to be very difficult via a sp.<<
Not difficult passing an XML document and using OpenXML. But we can probably debate whether there would be any speed improvement. A real-world test would probably put this subject to rest :)
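
A minimal sketch of the OPENXML approach, assuming a hypothetical document layout of '/rows/row' with one child element per column (only three columns shown):
---------------
CREATE PROC cxacyr_insert_xml
@doc ntext
AS
DECLARE @hDoc int
-- parse the XML document and get a handle to it
EXEC sp_xml_preparedocument @hDoc OUTPUT, @doc

INSERT INTO cxacyr_mike (Col001, Col002, Col003 /* ..., Col028 */)
SELECT Col001, Col002, Col003 -- ..., Col028
FROM OPENXML(@hDoc, '/rows/row', 2)
WITH (Col001 varchar(255), Col002 varchar(255), Col003 varchar(255))

-- release the document's memory
EXEC sp_xml_removedocument @hDoc
GO
---------------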
 
LVL 69

Expert Comment

by:ScottPletcher
ID: 11813126
I would think just the overhead of parsing the Xml would remove it from performance contention.

Supposedly Yukon/2005 has a much better/faster parsing engine built into SQL.
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 11813356
>>I would think just the overhead of parsing the Xml would remove it from performance contention.<<
I beg to differ on this one.  Calling the same stored procedure 834 times has to be slower than calling it once and inserting all 834 rows at once using OpenXML.  But I guess the questioner should let us know if XML is even an option.
 
LVL 69

Expert Comment

by:ScottPletcher
ID: 11813379
Yes, I was referring to the overhead versus inserting all the data without using an SP or XML, just "straight" insert(s).  The conversion to XML, and the subsequent decoding, is a lot of overhead however one looks at it, especially since the length will be greater than 8000 bytes.
 
LVL 9

Expert Comment

by:SoftEng007
ID: 11814721
OK Scott and arbert... open for discussion.

>>> I've been working on some code in a VB.NET program to insert a lot (85,000) rows into an MS SQL 2000 table.

Given the 85K rows to insert:

Option 1: dump the rows to a text file and BULK INSERT.
Option 2: dump the rows to a text file and call a DTS package to load the table.
Option 3: export the rows to XML and load the table with OPENXML.
Option 4: ????

My choice: I favor DTS when possible because of the error trapping, logging, and transformation capabilities. I would choose option 2 and call the DTS package using dtsrun from the app or from a proc.
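
A minimal sketch of that call from T-SQL ('MyServer' and 'LoadCxacyr' are placeholder names; /S = server, /N = package name, /E = trusted connection):
---------------
-- run the DTS package from a proc via the command shell
EXEC master..xp_cmdshell 'dtsrun /S MyServer /N LoadCxacyr /E'
---------------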



 
LVL 69

Expert Comment

by:ScottPletcher
ID: 11814780
BULK INSERT is actually faster than DTS for loading rows, so my choice would be BI.
 
LVL 9

Expert Comment

by:SoftEng007
ID: 11817058
I agree Scott, but can you log exceptions to a formatted file and continue to process, or use transforms on each column, using BULK INSERT? Oh, and send an email with the exception file to a third party for examination?
 
LVL 6

Assisted Solution

by: gwalkeriq (earned 120 total points)
ID: 11817086
Assuming you can't use BULK INSERT, bcp, DTS, or XML, you can still speed things up considerably by avoiding server round trips.

In your original code, you are apparently creating this string dynamically via client-side code:

INSERT INTO cxacyr_mike
(Col001, Col002, Col003, Col004, Col005, Col006, Col007, Col008, Col009, Col010, Col011, Col012, Col013, Col014, Col015, Col016, Col017, Col018, Col019, Col020, Col021, Col022, Col023, Col024, Col025, Col026, Col027, Col028)
VALUES
('01','90000014','2004','no','804.03','0','0','0','0','0','0','324.03','0','0','0','0','0','480','0','0','DEANA','70479','28/02/04','CXYRENDA.P','','0','?','')

For any of the options below, you'll need to watch how big your strings grow and limit the number of rows you pass with each command.

Option 1) Build multiple INSERT statements delimited by semicolons and submit several rows' worth of inserts in a single command. Implementation is fairly obvious; the total speedup is limited because SQL Server will spend lots of time parsing.

Option 1a) Similar to 1, but build each line and pass the lines as separate parameters to a SQL proc. In the proc, use dynamic SQL to execute the commands one line at a time. Since the lines will all be very similar, SQL should be able to cache the plan even though the statements are dynamically executed. Don't forget to call SET NOCOUNT ON at the top of your proc. A sketch follows.
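
A minimal sketch of option 1a (proc and parameter names are illustrative; each parameter carries one complete INSERT statement built client-side):
---------------
CREATE PROC InsertLines
@line1 varchar(1024),
@line2 varchar(1024) = NULL,
@line3 varchar(1024) = NULL
-- etc.
AS
SET NOCOUNT ON -- suppress the per-insert rowcount messages
EXEC (@line1)
IF @line2 IS NOT NULL EXEC (@line2)
IF @line3 IS NOT NULL EXEC (@line3)
GO
---------------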


Option 2) Build one string per row and insert the rows using a single stored proc call. In this case, you only need the data portion of the INSERT clause as part of the string. Description follows.

You could build a string similar to this for each row to insert, then use a stored proc like this:

CREATE PROC InsertMulti
@row1 varchar(1024),
@row2 varchar(1024) = NULL,
@row3 varchar(1024) = NULL
-- etc.
AS

Then, after creating the insertion text for n rows, parse each row within your SQL proc. Since parsing comma-separated data is non-trivial (especially given the limitations of Transact-SQL), you will probably find it simpler and faster to use a separator character for each field, making sure the separator character cannot appear in the data.

e.g., using a vertical bar as the separator character (likely not a good choice, but easy to understand):

@row1 = '01|90000014|2004|no|804.03|0|0|0|0|0|0|324.03|0|0|0|0|0|480|0|0|DEANA|70479|28/02/04|CXYRENDA.P||0|?'

Then, build a parsing loop like this (note that charindex takes the search string first):

declare @n int, @s varchar(1024)
declare @Col001 varchar(255) -- etc. for all columns

-- set @s to @row1 the first time through the loop, @row2 the next time, etc.
set @s = @row1

set @n = charindex('|', @s)
select @Col001 = left(@s, @n - 1), @s = right(@s, len(@s) - @n)
set @n = charindex('|', @s)
select @Col002 = left(@s, @n - 1), @s = right(@s, len(@s) - @n)
set @n = charindex('|', @s)
select @Col003 = left(@s, @n - 1), @s = right(@s, len(@s) - @n)
-- etc.

Then issue the actual INSERT statement:

INSERT INTO cxacyr_mike
(Col001, Col002, Col003, etc.)
VALUES
(@Col001,@Col002,@Col003, etc.)


For any of these options, you'll have to time your own system to get the actual speedups, because it is difficult to predict which will be fastest.
 

Author Comment

by:bealem
ID: 11819934
Hi,

Thanks for all your ideas.

BillAn1 -
Good idea; they were actually varchars in the table anyway, I just didn't notice!
Unfortunately, it made no difference with the SP...

gwalkeriq -
I've tried making a string of more than one INSERT statement at a time. I experimented with different sizes, from 10 to 100 INSERT statements in one string; from 10 to about 50 there was no difference in speed, but it was slower with higher numbers. Worth using though, as it cuts down on trips to and from the server, as you say.

I didn't really understand your 2nd idea with the stored procedure! Sorry!



I've started to play with BULK INSERT.
Writing out the rows to a text file, then doing a BULK INSERT on them.
It's very quick, but I'm having a problem with it: when the BULK INSERT command is executed from my VB.NET code, I always get the following error:
Could not bulk insert because file 'c:\mike.txt' could not be opened. Operating system error code 32(error not found)

I've tried creating the file on my local hard drive and on the SQL server, and changing the path in the BULK INSERT command, but it never finds the file.

If I create the text file on the SQL Server, then run the BULK INSERT from Query Analyzer, it runs fine, but it just will not run from my VB.NET code?

I've tried:
BULK INSERT cxacyr_mike FROM 'c:\mike.txt' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

BULK INSERT cxacyr_mike FROM '\\server\c$\mike.txt' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

BULK INSERT cxacyr_mike FROM '\\server\c$\Program Files\Microsoft SQL Server\MSSQL\Data\mike.txt' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )

I can put any of those as they are into Query Analyzer and they will work, just not from my VB.NET code.

Cheers
Mike
 
LVL 69

Expert Comment

by:ScottPletcher
ID: 11821070
The path should be as seen by the server.  That is, if you place the file in the root of SQL's C: drive, then this "path":
c:\mike.txt
should find it.
 

Author Comment

by:bealem
ID: 11821124
Hi Scott,

That's what I would have thought.
When I use the command:
BULK INSERT cxacyr_mike FROM 'c:\mike.txt' WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )
I'm creating the file as "\\server\c$\mike.txt" from my VB.NET code.

I also put mike.txt on MY C: drive just in case the SQL Server was looking there, but no!
 
LVL 69

Expert Comment

by:ScottPletcher
ID: 11821330
Was the file actually created at that location on the server?
 
LVL 9

Assisted Solution

by: SoftEng007 (earned 30 total points)
ID: 11821490
Does the user you're connecting to the server with (the connection string in your VB.NET code) have permission to access the file?

Change your connection settings to sa, or a user in the sysadmin role, temporarily to see if it works.
 

Author Comment

by:bealem
ID: 11839307
Okay....

Scott:
Yes, the file is always created successfully on the server.


SoftEng007:
This is what I was thinking: user access.
I've made the SQL user I'm logging in with a sysadmin of the SQL server (gave it all server roles actually!).
I put the text file in a new folder, and gave the NT user EVERYONE full control of the folder & text file.

Also tried the SQL connection with my NT user account as I'm an admin of that server.

But no matter what combinations I try, I always get that same error:
"Could not bulk insert because file 'c:\testfolder\mike.txt' could not be opened. Operating system error code 32(error not found)"

Thanks for all your help so far :-)
 

Author Comment

by:bealem
ID: 11839381
Update...

Just had a sudden flash of inspiration (does not happen often!)

I put the BULK INSERT command into a stored procedure, gave my SQL user permission to run the new SP, and BINGO, it worked!

Not sure why I didn't think of that before!!!!
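
For reference, a minimal sketch of that arrangement ('appuser' is a placeholder for the application's SQL login):
---------------
CREATE PROC cxacyr_bulkload
AS
BULK INSERT cxacyr_mike
FROM 'c:\testfolder\mike.txt'
WITH ( FIELDTERMINATOR = ',', ROWTERMINATOR = '\n' )
GO

GRANT EXECUTE ON cxacyr_bulkload TO appuser
GO
---------------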

Cheers for all your tips & ideas, some useful stuff there.
