SQL Server: pass an array to INSERT INTO, or another method to speed up repeated calls

I have a rather simple stored procedure that is called many, many times from a C# application (using XSD datasets). The database and the application are on different servers, so the code takes a very long time to run. I think that if I called the procedure once with arrays it would be quicker.

INSERT INTO [dbo].[myTable] ([a], b, [c]) VALUES (@a, @b, @c);

So instead of calling the procedure, say, 5 times like the above, I'd like to call it once with arrays ( "1,2,3,4,5", "2,2,2,2,2", "10,20,30,40,50" ).

I think this would be quicker as the connection only has to be authorised once, etc.

Please advise, thank you.

Vitor Montalvão, MSSQL Senior Engineer, commented:
Why do you think that? Internally it will be single INSERTs anyway.
The only way to insert massive data faster is to use BULK INSERT command.
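A minimal sketch of that command (the file path, delimiters, and file layout here are illustrative, not from the question; the file must be readable from the SQL Server machine, which matters since this database is in the cloud):

```sql
-- Hypothetical example: load rows for myTable from a server-side CSV file.
-- 'C:\data\rows.csv' is an assumed path on the SQL Server host, with one
-- "a,b,c" row per line.
BULK INSERT dbo.myTable
FROM 'C:\data\rows.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column separator in the file
    ROWTERMINATOR   = '\n'   -- one row per line
);
```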
ste5an, Senior Developer, commented:
The correct approach would be using table-valued parameters.

Using CSV strings is, imho, not quicker, because parsing them requires string splitting, which is a comparatively slow operation.

The much simpler solution: just executing those five calls in one batch.
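A table-valued parameter version for the columns in the question could look like this (a sketch; the type name, procedure name, and INT column types are assumptions, since the question doesn't show the table definition):

```sql
-- Hypothetical table type matching myTable's columns (types assumed to be INT).
CREATE TYPE dbo.MyTableRows AS TABLE ([a] INT, b INT, [c] INT);
GO

-- Hypothetical procedure that inserts all passed rows in one statement.
CREATE PROCEDURE dbo.InsertMyTableRows
    @rows dbo.MyTableRows READONLY
AS
BEGIN
    INSERT INTO dbo.myTable ([a], b, [c])
    SELECT [a], b, [c]
    FROM @rows;
END
GO
```

From C#, fill a DataTable with the five rows and pass it as a SqlParameter with SqlDbType.Structured and TypeName = "dbo.MyTableRows", so all rows travel to the cloud server in a single round trip.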
rwallacej (Author) commented:
My thoughts were that calling the procedure once would be quicker than calling it many, many times?

The database is in the cloud and the application runs locally, so there are lots of round trips to the same procedure.

rwallacej (Author) commented:
So can you give an example of table-valued parameters with my columns and code, please?
Scott Pletcher, Senior DBA, commented:
Inserting all the rows at once will be quicker regardless, especially up to a size of ~60K bytes, as it reduces the SQL INSERT and transaction log effort. (A single 10-row INSERT will always be faster than ten 1-row INSERTs.)

As to string or table parameter: for anything under ~1000 rows you could pass them either as a table or as a string, as long as you split the string with a very efficient string splitter such as DelimitedSplit8K. Performance should be better either way. But if you use an inefficient splitter, then yes, that may slow the operation down noticeably.
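For illustration, assuming Jeff Moden's DelimitedSplit8K function is installed (it is not built into SQL Server), the three CSV strings from the question can be split and joined by position in one statement:

```sql
-- DelimitedSplit8K returns (ItemNumber, Item); joining on ItemNumber
-- pairs up the Nth element of each CSV string into one row.
INSERT INTO dbo.myTable ([a], b, [c])
SELECT a.Item, b.Item, c.Item
FROM dbo.DelimitedSplit8K('1,2,3,4,5', ',')   AS a
JOIN dbo.DelimitedSplit8K('2,2,2,2,2', ',')   AS b
    ON b.ItemNumber = a.ItemNumber
JOIN dbo.DelimitedSplit8K('10,20,30,40,50', ',') AS c
    ON c.ItemNumber = a.ItemNumber;
```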

Why don't you use this (a multi-row table value constructor, using the example data from the question):
INSERT INTO [dbo].[myTable] 
    ([a], b, [c]) 
SELECT 
     [a], b, [c] 
FROM (VALUES 
    (1, 2, 10), 
    (2, 2, 20), 
    (3, 2, 30), 
    (4, 2, 40), 
    (5, 2, 50) 
    ) t([a], b, [c]);

Microsoft SQL Server 2008