1) We have an application where the front end takes 30 input values; together with the current date & timestamp, 30 rows should be inserted into a table.

2) This can be 30, 60, 200 etc., i.e. any number of inputs and hence any number of records to insert into the table.

3) The char & varchar datatypes max out at 255 characters. The text datatype is big enough for us but cannot be passed as a stored procedure parameter.

4) We are trying to send a single concatenated string containing the values of all 30 (or however many) inputs from the front end. We would tokenize this string in the stored proc and carry out the inserts. Is this possible?

5) Are there any array-type datatypes in Sybase? Is there any other way of doing this in a stored procedure?

If I understand correctly, you just want to do multiple inserts from user-defined values. Why are you trying to force a single stored proc to do this? You should use a loop within your application language to do the inserts one at a time. That is the cleanest implementation.

Using string concatenation to pass the values into a stored proc as a single input, parsing it out within the sp, and looping through the pieces for insertion is a complete nightmare. Not to mention there is a size limitation on input parameters for the sp. This should be done at the application level, not in the database.

The only array-like structure I am aware of is the cursor, which is not applicable here. The only way I know of to do mass inserts in Sybase is: insert into TableOne select Cols... from TableTwo. This is very useful, but only if the data already exists in a db table somewhere. If it is passed to you from a front-end gui/form, there is no way around it: just loop through multiple insert statements.
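The application-level loop described above might look like the following, sketched in Python against a generic DB-API-style driver. The table name `readings`, the column names, and the `conn` object are assumptions for illustration; the point is one parameterized insert per value, all wrapped in a single transaction.

```python
from datetime import datetime

def build_insert_batch(values, table="readings"):
    """Build one parameterized INSERT per input value.

    Returns a list of (sql, params) pairs so the caller can execute
    them in a loop on any DB-API-style connection.
    """
    now = datetime.now()
    sql = f"insert into {table} (value, created_at) values (?, ?)"
    return [(sql, (v, now)) for v in values]

def insert_all(conn, values):
    # One insert per row, all inside a single transaction so the
    # batch either fully succeeds or fully rolls back.
    cur = conn.cursor()
    try:
        for sql, params in build_insert_batch(values):
            cur.execute(sql, params)
        conn.commit()
    except Exception:
        conn.rollback()
        raise
```

Because the statements are parameterized, the driver handles quoting, and the same loop works whether the user submits 30 values or 200.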


Jan Franek commented:
Ad 3) - in newer versions of ASE (I think 12.5 or later) char & varchar datatypes can have up to 16384 characters

Ad 4) - yes, it is possible - using the charindex and substring functions it can be done quite easily, but I'm not sure how efficient string handling is in T-SQL (compared to C)

Ad 5) - no, there's no array datatype in Sybase ASE
On 12.5.x, the maximum size of a char/varchar is increased to ~16k, so you could pass quite a few values (though not an unlimited number) concatenated together in one string.

But I have to agree, I don't think handling the values that way is a good idea, it seems much cleaner to just have the application do one insert for each value submitted.
Joe Woodhouse (Principal Consultant) commented:
You've already been given good advice, and I agree with what the others have said. There's little business or technical reason to want to handle multiple inserts like this in a single procedure call.

You could place all the rows to be inserted in a file, or something that looks like a file (a UNIX named pipe, for example), and call bcp to do a bulk insert to the table in a single operation. This is faster than row by row inserts - "slow" bcp will be the same speed as INSERT ... SELECT, and "fast" bcp is roughly equivalent to SELECT INTO in performance. But you'll lose time in exporting the rows from your app to a file, and cleaning up the file afterwards. This has to be done outside of ASE, which means you lose ASE transaction management and error handling.
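The export step for the bcp approach might look like this in Python: writing the rows as tab-delimited, newline-terminated character data, which is the default format character-mode bcp (bcp -c) consumes. The filename is a temp file, and the database, table, user and server names in the example command are placeholders.

```python
import tempfile
from datetime import datetime

def write_bcp_file(values):
    """Write one tab-delimited row per value, newline-terminated:
    the default format for character-mode bcp (bcp -c)."""
    now = datetime.now().strftime("%b %d %Y %H:%M:%S")
    f = tempfile.NamedTemporaryFile("w", suffix=".bcp",
                                    delete=False, newline="")
    with f:
        for v in values:
            f.write(f"{v}\t{now}\n")
    return f.name

# The file would then be loaded with something like:
#   bcp mydb..readings in /tmp/xxxxx.bcp -c -Uuser -Sserver
# (database, table, user and server names are placeholders)
```

As noted above, the file write, the external bcp call, and the cleanup all happen outside ASE, so you give up ASE's transaction management for the load.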

It *is* possible to do a "bulk array insert" directly from a C array structure in your app into a Sybase table, using the Bulk Library functions in Open Client. This is basically equivalent to doing a bcp from a file into a table. It is complicated, and unless you're talking hundreds of thousands of rows that have to be loaded as quickly as possible (this approach *will* outperform row-by-row inserts in the same way that fast bcp does), it probably isn't worth the coding and testing effort.

Nah, stick with row by row inserts from your app. If a single-threaded approach isn't fast enough, look into multiple connections each doing a subset of the inserts. This will introduce locking and concurrency problems which you'd need to fix with partitioning the table and/or row-level locking - both of which introduce other issues.
The trick here is not to have one proc call that does 30 inserts (because, as everyone has already said, that is just too hard and silly), but to batch 30 EXEC procedure statements into one string and then pass that string to Sybase as if it were a single statement.

This is much more efficient than 30 separate calls to Sybase (especially over a WAN).
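Building such a batch might look like this, sketched in Python. The procedure name `insert_reading` and the minimal quoting helper are assumptions for illustration; real code must escape user input carefully (or better, validate it) to avoid SQL injection.

```python
def build_exec_batch(proc, values):
    """Concatenate one EXEC statement per value into a single
    command string that can be sent to the server in one round trip."""
    def quote(v):
        # Minimal T-SQL string quoting: double any embedded quotes.
        return "'" + str(v).replace("'", "''") + "'"
    return "\n".join(f"exec {proc} {quote(v)}" for v in values)
```

For example, `build_exec_batch("insert_reading", [10, 20, 30])` yields three EXEC lines that are submitted as one command, so the client pays for one network round trip instead of three.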

NOTE: you may have to increase your buffer sizes for larger strings to be passed.

If this is not a matter of improving performance and is just about batching commands that belong together, then you should be looking at transaction-control techniques instead.