Solved

Returning a result cursor into comma delimited string?

Posted on 2003-11-17
294 Views
Last Modified: 2006-11-17
I have a stored procedure in which I get a cursor of about 100+ rows (only one column).
I need to concatenate each row's value into a comma-delimited string, i.e. "210, 211, 212, 213..."
Is there an existing method that does that?
If not, what data type can I use to ensure that the length of the string does not go out of bounds?
So if I declare a variable strDelimited, what data type can I use so that its length will not be exceeded if the cursor returns 1000+ rows?

Thanks
0
Comment
Question by:StriderX
7 Comments
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 9767769
The maximum varchar is 8000 bytes.  If you think you are going to exceed that, then perhaps you should do it from the front end.

If you can post your code, it should help.

Anthony
0
 

Author Comment

by:StriderX
ID: 9768119
DECLARE @strBFailed varchar(4000)

DECLARE @strInd varchar(10)
Set @strBFailed = ''


--FETCH ALL ROWS THAT DO NOT MATCH THE CURRENT ONE!

DECLARE curBFailed CURSOR FOR
SELECT
cast(DETAILID as varchar(10))
FROM
DETAIL b
where
status = 'Pending' and ProcessTime = @dtmTimeStamp

OPEN curBFailed
FETCH NEXT FROM curBFailed INTO @strInd
WHILE @@FETCH_STATUS = 0
BEGIN
      Set @strBFailed = @strBFailed + @strInd + CHAR(13)
      FETCH NEXT FROM curBFailed INTO @strInd
END
CLOSE curBFailed
DEALLOCATE curBFailed


That's the code.
How do I, or what data type can I use to, ensure that @strBFailed does not exceed varchar(4000)?
Basically I don't know how many rows curBFailed will return, and I need to make sure that an unlimited number of rows will not result in @strBFailed failing because it has run out of space to hold characters.

Thanks
0
 
LVL 75

Expert Comment

by:Anthony Perkins
ID: 9768160
What about changing:
DECLARE @strBFailed varchar(4000)

To:
DECLARE @strBFailed varchar(8000)

If that is not sufficient, you will have to do it in the front end.

Anthony
0

 
LVL 3

Expert Comment

by:xmstr
ID: 9771845
You can do it the way StriderX posted and it will work, but the performance will be lacking because of the use of a cursor.  The same thing can be accomplished without a cursor and will be much faster (see example below).  The string length restriction described by acperkins still applies.

declare @str varchar(8000)    
set @str = ''

select @str = @str + isnull(cast(UserNbr as varchar), '') + ','
from tblUser

-- Trim off last comma
if (right(@str, 1) = ',')
      set @str = left(@str, len(@str) - 1)

print @str

-x
0
 
LVL 69

Accepted Solution

by:
ScottPletcher earned 50 total points
ID: 9774514
xmstr looks right to me.  In this particular situation you probably also want to add a length check to make sure you don't waste time processing rows that don't fit into your string anyway, for example (and going back to 4000 -- who could read more than that anyway?):

DECLARE @ids VARCHAR(4000)
SET @ids = ''

SELECT @ids = @ids + ISNULL(CAST(detailId AS VARCHAR(10)), '') + ','
FROM detail
WHERE status = 'Pending' AND ProcessTime = @dtmTimeStamp
AND LEN(@ids) <= 3990

SET @ids = LEFT(@ids, LEN(@ids) - 1)
0
 
LVL 69

Expert Comment

by:ScottPletcher
ID: 9774535
Oops, the check before the LEFT is needed:

IF @ids <> ''
    SET @ids = LEFT(@ids, LEN(@ids) - 1)
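
For reference, a sketch of the two comments above combined into one batch (assuming the same DETAIL table and @dtmTimeStamp variable from the original post):

```sql
-- Sketch only: set-based concatenation with the length guard and the
-- empty-string check, in place of the cursor.
DECLARE @ids VARCHAR(4000)
SET @ids = ''

SELECT @ids = @ids + ISNULL(CAST(detailId AS VARCHAR(10)), '') + ','
FROM detail
WHERE status = 'Pending' AND ProcessTime = @dtmTimeStamp
  AND LEN(@ids) <= 3990   -- stop growing once near the varchar(4000) limit

-- Trim the trailing comma only when at least one row matched
IF @ids <> ''
    SET @ids = LEFT(@ids, LEN(@ids) - 1)
```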
0
 
LVL 3

Expert Comment

by:xmstr
ID: 9779162
I appreciate you giving someone else credit for the exact same answer I gave you!  Thanks a lot! -x
0

