Solved

Kill a SPID if it is taking more than x% of resources

Posted on 2010-09-16
14
339 Views
Last Modified: 2012-08-13
Is it possible to configure SQL Server to automatically kill a SPID if it is taking more than x amount of processor or memory?

thanks
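For reference, there is no built-in kill-on-threshold setting; the closest manual equivalent is polling the request DMVs. A sketch (assumes SQL Server 2005+; the 5-minute CPU threshold is arbitrary):

-- Find requests that have consumed more than 5 minutes of CPU.
-- Review the output before killing anything.
SELECT r.session_id, r.cpu_time, r.granted_query_memory, t.text
FROM sys.dm_exec_requests r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) t
WHERE r.cpu_time > 300000          -- cpu_time is in milliseconds
  AND r.session_id <> @@SPID

-- KILL does not accept a variable, so it must be built dynamically:
-- DECLARE @sql NVARCHAR(20)
-- SET @sql = N'KILL ' + CAST(53 AS NVARCHAR(10))  -- 53 is a sample SPID
-- EXEC (@sql)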
Question by:anushahanna
14 Comments
 
LVL 60

Expert Comment

by:chapmandew
ID: 33691972
maybe...but why would you want to?
 
LVL 6

Author Comment

by:anushahanna
ID: 33692178
I'm running a *huge* query in the off-peak hours, but I don't want the system to go offline because of an overload, and cause more headaches in the morning...
 
LVL 10

Accepted Solution

by:
LMiller7 earned 84 total points
ID: 33692182
In general this would be a bad idea, even if it were possible. If a process is consuming too many resources this indicates a problem you should investigate. Killing it doesn't resolve the problem but merely sweeps it under the rug. This destroys all evidence of the problem and makes it more difficult to resolve.

Killing a process may improve the performance of a server. Small consolation when your users complain that they can't access the resources they require because an important process isn't running.
 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 333 total points
ID: 33692271
>> is it possible to configure to automatically kill a SPID if it is taking x amount of processor or memory

I wouldn't go with this approach. Instead, I would start by:

* Fine-tuning the query or job that uses the most processor or memory
* Creating appropriate indexes

Note: Remember that killing a job or SPID affects your valuable DATA.
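If capping (rather than killing) the workload is acceptable and you are on SQL Server 2008+ Enterprise, Resource Governor can limit a session's CPU and memory instead. A sketch with hypothetical pool/group names (a classifier function is still needed to route sessions into the group):

-- Cap CPU and memory for off-peak batch work instead of killing it.
CREATE RESOURCE POOL OffPeakPool
	WITH (MAX_CPU_PERCENT = 50, MAX_MEMORY_PERCENT = 40)

CREATE WORKLOAD GROUP OffPeakGroup
	USING OffPeakPool

ALTER RESOURCE GOVERNOR RECONFIGURE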
 
LVL 60

Expert Comment

by:chapmandew
ID: 33692319
describe huge query....
 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 333 total points
ID: 33692368
>> Note: Remember killing a job or spid affects your valuable DATA.

Instead, you can cancel the execution of that query altogether to reduce the overall load on the server.
 
LVL 6

Author Comment

by:anushahanna
ID: 33692509
>> Note: Remember killing a job or spid affects your valuable DATA.

Data integrity will not be affected, right? If it is not all successful, it will roll back?
 
LVL 60

Assisted Solution

by:chapmandew
chapmandew earned 83 total points
ID: 33692514
>>data integrity will not be affected, right? if not all successful, it will rollover?

Maybe, maybe not.  Depends on what you're doing and how you have it structured.
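In general, KILL rolls back whatever transaction the killed SPID has open, but statements that already committed stay committed. A sketch (the table name is hypothetical):

-- Wrapped in one explicit transaction, a KILL rolls the whole batch back.
BEGIN TRANSACTION
	UPDATE dbo.SomeTable SET col = 1   -- hypothetical work
	-- if the SPID is killed here, the UPDATE above is rolled back
COMMIT TRANSACTION

-- Without the explicit transaction, each statement commits on its own,
-- and a KILL only rolls back the statement in flight.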
 
LVL 6

Author Comment

by:anushahanna
ID: 33692546
>>describe huge query....

A one-time operation that does the same thing over and over again. One I am working on right now, as an example:
@str = REPLACE(@str, CHAR(32) + CHAR(32), CHAR(32)) in a loop
I need to compare procs for any slight modification between the test and dev environments. It has taken hours and is still executing..

just concerned about it..
 
LVL 6

Author Comment

by:anushahanna
ID: 33692558
>>Killing it doesn't resolve the problem but merely sweeps it under the rug.

I agree. But for occasional data checks that involve heavy processing of data, a failure should not affect users. I see your point, though. Thanks.
 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 333 total points
ID: 33692708
>> Maybe, maybe not.  Depends on what you're doing and how you have it structured.

well said, chapman

>> need to compare procs for any slight modification between test and dev environments. has taken hours and still executing..

Try some alternative logic instead of looping..
Posting the entire logic would help us give you some tuning tips..
 
LVL 6

Author Comment

by:anushahanna
ID: 33692942
sure. thanks.

CREATE FUNCTION [dbo].[udf_cleaner] (@input VARCHAR(MAX))
RETURNS VARCHAR(MAX)
AS
BEGIN
	DECLARE @result VARCHAR(MAX)

	-- Normalize carriage returns, line feeds, and tabs to spaces
	SET @result = REPLACE(@input, CHAR(13), CHAR(32))
	SET @result = REPLACE(@result, CHAR(10), CHAR(32))
	SET @result = REPLACE(@result, CHAR(9), CHAR(32))

	-- Collapse each run of spaces down to a single space
	WHILE PATINDEX('%  %', @result) > 0
		SET @result = REPLACE(@result, CHAR(32) + CHAR(32), CHAR(32))

	RETURN (@result)
END
GO

-- List procedures whose whitespace-normalized definitions differ
-- between DB1 and DB2
SELECT OBJECT_NAME(a.object_id, DB_ID('DB1'))
FROM
	(SELECT m.* FROM DB1.sys.sql_modules m
	 JOIN DB1.sys.objects o ON o.object_id = m.object_id AND o.type = 'P') a
JOIN
	(SELECT m.* FROM DB2.sys.sql_modules m
	 JOIN DB2.sys.objects o ON o.object_id = m.object_id AND o.type = 'P') b
ON
	OBJECT_NAME(a.object_id, DB_ID('DB1')) = OBJECT_NAME(b.object_id, DB_ID('DB2')) AND
	dbo.udf_cleaner(a.definition) <> dbo.udf_cleaner(b.definition)

DROP FUNCTION [dbo].[udf_cleaner]

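As an aside, the WHILE loop in udf_cleaner can be replaced with a loop-free triple REPLACE, which is usually much faster on long definitions. A sketch; it assumes CHAR(7) (the BEL character) never appears in the input:

-- Collapse any run of spaces to one space without looping.
SET @result = REPLACE(REPLACE(REPLACE(@result,
		CHAR(32), CHAR(32) + CHAR(7)),   -- tag every space
		CHAR(7) + CHAR(32), ''),         -- delete tag+space pairs
		CHAR(7), '')                     -- drop leftover tags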
 
LVL 57

Assisted Solution

by:Raja Jegan R
Raja Jegan R earned 333 total points
ID: 33693890
It seems like you are trying to find the object dependencies of a procedure.
You can find them easily using the SQL Dependency Tracker tool linked below:

http://www.red-gate.com/products/SQL_Dependency_Tracker/
 
LVL 6

Author Comment

by:anushahanna
ID: 33749910
Thank you.