Solved

kill a SPID if taking more than x% of resources

Posted on 2010-09-16
Medium Priority · 364 Views · Last Modified: 2012-08-13
Is it possible to configure SQL Server to automatically kill a SPID if it is taking more than x amount of processor or memory? Is there a way to set it up that way?

thanks
Question by:anushahanna
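
For reference, SQL Server has no built-in "kill a SPID at x% of resources" setting; the closest do-it-yourself approximation is a scheduled Agent job that polls the DMVs and kills offenders. A minimal sketch, assuming SQL Server 2005 or later; the 10-minute CPU threshold and the decision to issue KILL are illustrative assumptions, not a recommendation:

-- Sketch of a watchdog an Agent job could run every few minutes.
-- The threshold (10 minutes of CPU) is an arbitrary example; adjust it or
-- swap in memory-based criteria as needed.
DECLARE @spid INT, @cmd NVARCHAR(20);

DECLARE offenders CURSOR LOCAL FAST_FORWARD FOR
    SELECT r.session_id
    FROM sys.dm_exec_requests r
    WHERE r.session_id <> @@SPID        -- never kill this watchdog session
      AND r.session_id > 50             -- skip system sessions
      AND r.cpu_time > 600000;          -- cpu_time is in milliseconds

OPEN offenders;
FETCH NEXT FROM offenders INTO @spid;
WHILE @@FETCH_STATUS = 0
BEGIN
    SET @cmd = N'KILL ' + CAST(@spid AS NVARCHAR(10));
    EXEC (@cmd);                        -- KILL only accepts a literal, hence dynamic SQL
    FETCH NEXT FROM offenders INTO @spid;
END
CLOSE offenders;
DEALLOCATE offenders;

As the comments below point out, killing a session mostly hides the underlying problem rather than fixing it, and the killed session's open transaction still has to roll back.
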
14 Comments
 
LVL 60

Expert Comment

by:chapmandew
ID: 33691972
maybe...but why would you want to?
 
LVL 6

Author Comment

by:anushahanna
ID: 33692178
Running a *huge* query in the off-peak hours, but I don't want the system to go offline because of an overload, and cause more headaches in the morning...
 
LVL 10

Accepted Solution

by:
LMiller7 earned 336 total points
ID: 33692182
In general this would be a bad idea, even if it were possible. If a process is consuming too many resources this indicates a problem you should investigate. Killing it doesn't resolve the problem but merely sweeps it under the rug. This destroys all evidence of the problem and makes it more difficult to resolve.

Killing a process may improve the performance of a server, but that's small consolation when your users complain that they can't access the resources they require because an important process isn't running.
 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 1332 total points
ID: 33692271
>> is it possible to configure to automatically kill a SPID if it is taking x amount of processor or memory

I wouldn't go with this approach. Instead I would start by:

* Fine-tuning the query or job that uses the most processor or memory (the sketch below shows one way to find the heaviest queries)
* Creating appropriate indexes

Note: Remember that killing a job or SPID can affect your valuable DATA.
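
For the first bullet, a quick way to see which statements are actually the heaviest is the plan-cache DMVs. A minimal sketch (SQL Server 2005 or later); the TOP (10) and the CPU ordering are illustrative choices only:

-- List the top CPU consumers from the plan cache so you know
-- which query or job to tune first.
SELECT TOP (10)
       qs.total_worker_time / 1000 AS total_cpu_ms,   -- total_worker_time is in microseconds
       qs.execution_count,
       SUBSTRING(st.text,
                 qs.statement_start_offset / 2 + 1,
                 (CASE WHEN qs.statement_end_offset = -1
                       THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset END
                  - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM sys.dm_exec_query_stats qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY qs.total_worker_time DESC;
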
 
LVL 60

Expert Comment

by:chapmandew
ID: 33692319
describe huge query....
 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 1332 total points
ID: 33692368
>> Note: Remember killing a job or spid affects your valuable DATA.

Instead, you can cancel the execution of that query altogether to reduce the overall load on the server.
 
LVL 6

Author Comment

by:anushahanna
ID: 33692509
>> Note: Remember killing a job or spid affects your valuable DATA.

Data integrity will not be affected, right? If it doesn't all succeed, it will roll back?
 
LVL 60

Assisted Solution

by:chapmandew
chapmandew earned 332 total points
ID: 33692514
>>Data integrity will not be affected, right? If it doesn't all succeed, it will roll back?

Maybe, maybe not.  Depends on what you're doing and how you have it structured.
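
To make the "maybe" concrete: whether a kill leaves the data clean depends mostly on whether the work runs inside one explicit transaction. A hypothetical sketch (the table and column names are made up):

-- Everything below runs in a single explicit transaction, so if the session
-- is killed (or anything fails) before COMMIT, SQL Server rolls the whole
-- batch back and no partial changes remain. Without BEGIN TRANSACTION,
-- each UPDATE auto-commits on its own and a kill can leave the work half done.
BEGIN TRANSACTION;

UPDATE dbo.SomeTable  SET SomeColumn  = LTRIM(RTRIM(SomeColumn));   -- made-up example work
UPDATE dbo.OtherTable SET OtherColumn = LTRIM(RTRIM(OtherColumn));  -- made-up example work

COMMIT TRANSACTION;

Also keep in mind that rolling back a large killed transaction can itself take a long time.
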
 
LVL 6

Author Comment

by:anushahanna
ID: 33692546
>>describe huge query....

A one-time/occasional operation that does the same thing over and over again. The one I am working on right now, as an example:
@str = replace(@str, CHAR(32) + CHAR(32), CHAR(32)) in a loop
I need to compare procs for any slight modification between the test and dev environments; it has been running for hours and is still executing.

Just concerned about it..
 
LVL 6

Author Comment

by:anushahanna
ID: 33692558
>>Killing it doesn't resolve the problem but merely sweeps it under the rug.

I agree. But these are occasional data checks that involve heavy processing, and a failure there should not affect users. I see your point, though. Thanks.
 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 1332 total points
ID: 33692708
>> Maybe, maybe not.  Depends on what you're doing and how you have it structured.

well said, chapman

>> I need to compare procs for any slight modification between the test and dev environments; it has been running for hours and is still executing.

Try some alternative logic instead of looping..
Posting the entire logic would help us give you some tuning tips..
 
LVL 6

Author Comment

by:anushahanna
ID: 33692942
sure. thanks.

CREATE FUNCTION [dbo].[udf_cleaner] (@input VARCHAR(MAX))
RETURNS VARCHAR(MAX)
AS
BEGIN
	DECLARE @result VARCHAR(MAX)
	-- replace carriage returns, line feeds and tabs with spaces
	SET @result = REPLACE(@input, CHAR(13), CHAR(32))
	SET @result = REPLACE(@result, CHAR(10), CHAR(32))
	SET @result = REPLACE(@result, CHAR(9), CHAR(32))
	-- collapse runs of spaces down to a single space
	WHILE PATINDEX('%  %', @result) > 0
		SET @result = REPLACE(@result, CHAR(32) + CHAR(32), CHAR(32))
	RETURN (@result)
END
GO

-- list procedures whose whitespace-normalized definitions differ between DB1 and DB2
select object_name(a.object_id,db_id('DB1')) from 
	(select m.* from DB1.sys.sql_modules m, DB1.sys.objects o where o.object_id = m.object_id and type = 'p') a
join
	(select m.* from DB2.sys.sql_modules m,DB2.sys.objects o where o.object_id = m.object_id and type = 'p') b
on
	object_name(a.object_id,db_id('DB1')) = object_name(b.object_id,db_id('DB2')) and 
	dbo.udf_cleaner(a.definition) <> dbo.udf_cleaner(b.definition)

DROP FUNCTION [dbo].[udf_cleaner]
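
For what it's worth, one possible way to avoid both the scalar UDF and the WHILE loop is to collapse the whitespace set-based with the nested-REPLACE token trick, and to pre-filter on the raw definitions so the expensive cleanup only runs on rows that can actually differ. A sketch only; it assumes CHAR(7) never appears in the procedure text and, like the original, matches procedures by name alone:

-- Set-based whitespace collapse: CHAR(7) is a throwaway marker assumed
-- never to appear in the procedure definitions.
SELECT OBJECT_NAME(a.object_id, DB_ID('DB1')) AS proc_name
FROM DB1.sys.sql_modules a
JOIN DB1.sys.objects oa ON oa.object_id = a.object_id AND oa.type = 'P'
JOIN DB2.sys.objects ob ON ob.name = oa.name AND ob.type = 'P'
JOIN DB2.sys.sql_modules b ON b.object_id = ob.object_id
WHERE a.definition <> b.definition   -- cheap pre-filter: identical raw text cannot differ after cleanup
  AND REPLACE(REPLACE(REPLACE(
        REPLACE(REPLACE(REPLACE(a.definition, CHAR(13), ' '), CHAR(10), ' '), CHAR(9), ' '),
        ' ', ' ' + CHAR(7)), CHAR(7) + ' ', ''), CHAR(7), '')
   <> REPLACE(REPLACE(REPLACE(
        REPLACE(REPLACE(REPLACE(b.definition, CHAR(13), ' '), CHAR(10), ' '), CHAR(9), ' '),
        ' ', ' ' + CHAR(7)), CHAR(7) + ' ', ''), CHAR(7), '');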

 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 1332 total points
ID: 33693890
Seems like you are trying to find the object dependencies of a procedure.
You can easily find them using the SQL Dependency Tracker tool linked below:
http://www.red-gate.com/products/SQL_Dependency_Tracker/
 
LVL 6

Author Comment

by:anushahanna
ID: 33749910
Thank you.
