Solved

sys.conversation_endpoints has a huge number of open conversations

Posted on 2014-02-16
3
130 Views
Last Modified: 2016-06-03
We are using MSSQL Standard 2008 R2

Our queue used to have zero items backed up; it would stay at zero, and it was rare that I could ever catch it with an unprocessed item in it.

We just transferred our database (via mirroring) from our MSSQL Enterprise 2008 R2 instance to Amazon, running MSSQL Standard 2008 R2. Everything appeared to be working until I recently discovered that the queue had not been processed for a week because there was no master key on the server.

I've added the master key; however, my queue is still quite large and not decreasing at any reliable rate.

If I select count(*) from MyQueue, I get 1.5 million items; I used to get zero. Like I said, I believe this backlog built up because the queue was not processing for a week. I've since added the master key and restarted the queue by disabling and re-enabling it, but I don't really see any change. It's gone down by 10K but isn't flushing like I would expect.
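For reference, the two remediation steps above were roughly the following (the password is just a placeholder; adjust names to your environment):

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'StrongPasswordHere!1';

-- restart the queue by toggling it off and back on
ALTER QUEUE [dbo].[MyQueue] WITH STATUS = OFF;
ALTER QUEUE [dbo].[MyQueue] WITH STATUS = ON;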

If I select count(*) from sys.conversation_endpoints, I get 2M rows.

When I sample sys.conversation_endpoints at various positions (beginning/middle/end), all of the rows are in the "CONVERSING" or "DISCONNECTED_INBOUND" state.
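For example, a full breakdown by state can be had with a simple GROUP BY over the standard catalog view:

SELECT state_desc, is_initiator, COUNT(*) AS endpoint_count
FROM sys.conversation_endpoints
GROUP BY state_desc, is_initiator
ORDER BY endpoint_count DESC;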



My question is: are these existing conversations *ever* going to flush, or should I remove them from the queue since they were submitted in an invalid environment? (And if so, what is the best way to do that?)

My biggest problem right now is that new items are not being processed, at least not in a timely manner, and I believe it is simply due to the size of the queue.

------------------------------------------------------------

I've obfuscated some of the names, but here are the three main pieces of code I think you would need to see.

Here is the definition of the queue:
ALTER QUEUE [dbo].[MyQueue]
	WITH STATUS = ON,
	RETENTION = OFF,
	ACTIVATION (
		STATUS = ON,
		PROCEDURE_NAME = [fp].[spProcessMyQueue],
		MAX_QUEUE_READERS = 32,
		EXECUTE AS N'dbo'
	),
	POISON_MESSAGE_HANDLING (STATUS = OFF);
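As a quick sanity check, the standard catalog view and DMV below can confirm that activation is enabled on the queue and that reader tasks are actually running (queue name as above):

SELECT name, is_receive_enabled, is_activation_enabled, max_readers
FROM sys.service_queues
WHERE name = 'MyQueue';

SELECT spid, queue_id, procedure_name, execute_as
FROM sys.dm_broker_activated_tasks;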




The SP that processes a message:
ALTER PROC [fp].[spProcessMyQueue]
AS
DECLARE @xml xml
	,@conversation_handle UNIQUEIDENTIFIER
	,@message_type_name varchar(100)
	,@ErrorMsg varchar(500);

WHILE (1 = 1)
BEGIN
	SET @ErrorMsg = '';
	SET @conversation_handle = NULL;

	-- Pull one message off the queue, waiting up to 1 second.
	WAITFOR(
	RECEIVE TOP(1)
		@xml = CASE WHEN validation = 'X'      -- 'X' = message body is well-formed XML
					THEN CAST(message_body AS XML)
					ELSE CAST(N'<none/>' AS XML)
					END
		,@conversation_handle = conversation_handle
		,@message_type_name = message_type_name
	FROM dbo.MyQueue
	), TIMEOUT 1000;

	-- Nothing arrived before the timeout: exit so the activated task can finish.
	IF @conversation_handle IS NULL
		BREAK;

	-- End our side of the dialog; WITH CLEANUP removes the local endpoint
	-- without notifying the other side of the conversation.
	END CONVERSATION @conversation_handle WITH CLEANUP;

	BEGIN TRY
		-- ...Processing of the message occurs here.
		-- Contains lookups, some updates/inserts.
	END TRY
	BEGIN CATCH
		--insert into error log table
	END CATCH
END



This is how I'm inserting messages into the queue.
ALTER PROC [fp].[createMyQueueMessage] @xml XML
AS
	DECLARE @Conversation_Handle UNIQUEIDENTIFIER;

	IF @xml IS NOT NULL
	BEGIN
		BEGIN TRY
			-- Only start a dialog and send when the XML matches this filter;
			-- both statements sit inside the IF block.
			IF CAST(@xml AS varchar(4000)) LIKE '%aid="370"%'
			BEGIN
				BEGIN DIALOG CONVERSATION @Conversation_Handle
					FROM SERVICE [MyService]
					TO SERVICE 'MyService'
					ON CONTRACT [MyContractType]
					WITH ENCRYPTION = OFF;

				SEND ON CONVERSATION @Conversation_Handle
					MESSAGE TYPE [MymessageType] (@xml);
			END
		END TRY
		BEGIN CATCH
			--insert into error log table
		END CATCH
	END
	


Question by:kzsigo
3 Comments
 

Author Comment

by:kzsigo
ID: 39862749
I did see this post:
http://www.experts-exchange.com/Microsoft/Development/MS-SQL-Server/SQL-Server-2005/Q_23105681.html

Which links to:
http://www.sqlteam.com/article/centralized-asynchronous-auditing-across-instances-and-servers-with-service-broker

This shows me how to end the conversations, if that is the right direction to go to unclog things. I really do want those items processed, so I would probably iterate through them all, close them, and then resubmit them, now that the setup should be able to keep up.
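Something like the loop below is what I have in mind (a sketch only, based on those links, not their exact script; resubmitting would mean reading the message_body off the queue before ending each conversation):

DECLARE @handle UNIQUEIDENTIFIER;

DECLARE conv_cursor CURSOR LOCAL FAST_FORWARD FOR
	SELECT conversation_handle
	FROM sys.conversation_endpoints
	WHERE state IN ('CO', 'DI');   -- CONVERSING / DISCONNECTED_INBOUND

OPEN conv_cursor;
FETCH NEXT FROM conv_cursor INTO @handle;

WHILE @@FETCH_STATUS = 0
BEGIN
	-- WITH CLEANUP drops the local endpoint without notifying the other side.
	END CONVERSATION @handle WITH CLEANUP;
	FETCH NEXT FROM conv_cursor INTO @handle;
END

CLOSE conv_cursor;
DEALLOCATE conv_cursor;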
 

Accepted Solution

by:
kzsigo earned 0 total points
ID: 39864962
To anyone who has a similar issue, I created a new service/queue/processor triplet and began sending new items into the new queue.

My new queue is processing items immediately, with zero latency. No new items are being added to the old one, and I can see its count going to zero slowly but surely. It is very slow, about one per second, and I have about 1M to go. I'm not really worried about them; I'm willing to bite the bullet and just let it finish over the next 11 days. My new items are being processed just fine now, so the emergency is over at least.
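For anyone wanting to do the same, the new triplet is essentially a copy of the old objects under new names (a sketch; MyQueue2, MyService2, and spProcessMyQueue2 are placeholder names, and the activation proc is a copy of the original pointed at the new queue):

CREATE QUEUE [dbo].[MyQueue2] WITH STATUS = ON;

CREATE SERVICE [MyService2]
	ON QUEUE [dbo].[MyQueue2] ([MyContractType]);

-- after creating [fp].[spProcessMyQueue2] (a copy of the original proc reading MyQueue2):
ALTER QUEUE [dbo].[MyQueue2]
	WITH ACTIVATION (
		STATUS = ON,
		PROCEDURE_NAME = [fp].[spProcessMyQueue2],
		MAX_QUEUE_READERS = 32,
		EXECUTE AS N'dbo'
	);

-- createMyQueueMessage then begins its dialogs FROM SERVICE [MyService2] TO SERVICE 'MyService2'.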

I don't really want to accept this as the solution, because my original question was about the best way to do this, and I don't know whether what I've done really answers that...