Solved

"Time-out expired" Error

Posted on 2006-07-06
11
212 Views
Last Modified: 2010-05-18

Hi there!

While executing a SQL Server 2005 view (locally) I got this error message. When I add a restriction like

"Where TableName.FieldName !> 100"

everything works fine.

01. Can I alter the time limit for running scripts, and if so, how?

02. Considering your (expert) experience, what timeout value is still "not too big", considering I intend to run all of this on the server, fired (and populated) by an ASP.NET website (nothing local)?

Thanks,
fskilnik.


Question by:fskilnik

11 Comments
 
LVL 75

Assisted Solution

by:Aneesh Retnakaran
Aneesh Retnakaran earned 100 total points
ID: 17051972
Do you have an index on 'FieldName'?

Also, note that "!>" is non-standard shorthand for "not greater than"; the equivalent standard T-SQL syntax is
"Where TableName.FieldName <= 100"
 
LVL 7

Expert Comment

by:ExpertAdmin
ID: 17052121
The first thing to do (as aneeshattingal suggested) is improve efficiency. With web applications, anything over 6 seconds is a site killer, and the default timeout is much higher than that. So if the query is timing out on you, the best thing to do is make it more efficient.

If you could post your view code here we may be able to suggest ways of improving execution time.

M@
 

Author Comment

by:fskilnik
ID: 17052636

Thanks, Aneesh and M@ !

Aneesh: I will read about "index" in my books now. (I am developing my first SQL Server database...)

M@: Sure! I would be glad if you (both!) could tell me what should be done differently... Here it is (below)!

Thanks a lot,
Fábio.

View Code:

SELECT DISTINCT TOP (100) PERCENT
       Gen.StockID, Gen.EqtyTransID, Gen.TransDate,
       dbo.LastNegDate(Gen.StockID) AS LastNegDate,
       Gen.NegAccumSum, Gen.NegPreviousAccum,
       dbo.LastPositDate(Gen.StockID) AS LastPositDate,
       Gen.PositAccumSum, Gen.PositPreviousAccum,
       dbo.NegAccumQtyShares(dbo.LastNegDate(Gen.StockID), Gen.StockID) AS FinalAccumNegShares,
       dbo.PositAccumQtyShares(dbo.LastPositDate(Gen.StockID), Gen.StockID) AS FinalAccumPositShares
FROM   dbo.EqtyTransactionsGeneral AS Gen
       RIGHT OUTER JOIN dbo.EqtyTransactions_auxNeg AS Neg ON Gen.StockID = Neg.StockID
       RIGHT OUTER JOIN dbo.EqtyTransactions_auxPosit AS Posit ON Gen.StockID = Posit.StockID
ORDER BY Gen.StockID, Gen.TransDate


Note: dbo.NegAccumQtyShares and dbo.PositAccumQtyShares are functions.

Author Comment

by:fskilnik
ID: 17052668

Important: these functions are (I guess) the "time-consuming creatures"... as shown below:

CREATE FUNCTION [dbo].[NegAccumQtyShares] (@lastInclDate datetime, @StockID int)
RETURNS int
AS
BEGIN
    RETURN (SELECT SUM(SgnQtyShares)
            FROM dbo.EqtyTransactions_aux
            WHERE (SgnQtyShares < 0) AND (TransDate !> @lastInclDate) AND (StockID = @StockID))
END

The Posit one is almost identical (sure).

Thanks!
Fábio.
 
LVL 7

Accepted Solution

by:ExpertAdmin
ExpertAdmin earned 300 total points
ID: 17052754
OK...

A couple of things. RIGHT OUTER JOIN is a really bad thing unless it is absolutely required. What it tells the server is: give me every record from the right-hand table, whether or not it has a match on the left. If that is the desired behavior, then it is OK, but if not it can lead to pretty heavy server loading.
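If every row is in fact expected to have a match, the FROM clause of the view you posted could be written with inner joins instead. Just a sketch:

```sql
-- Inner joins return only the rows that match in all three tables
FROM dbo.EqtyTransactionsGeneral AS Gen
     INNER JOIN dbo.EqtyTransactions_auxNeg AS Neg
         ON Gen.StockID = Neg.StockID
     INNER JOIN dbo.EqtyTransactions_auxPosit AS Posit
         ON Gen.StockID = Posit.StockID
```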

Indexing will probably help. These appear to be transactional tables, which typically have a lot of rows.

Keep in mind that a view is a way of pulling data together to make it more usable, and it will typically have some criteria applied against it when a SELECT is run. The speed of the view itself is not as important as the speed at which a SELECT against it returns. That is because a lot of the time it takes to run the view is actually spent by the development tool (for example, Enterprise Manager) loading that many records into memory and displaying them.

A better way to test the view's speed is to do something like this:

SELECT COUNT(*) FROM MyView

This will return the number of records but will not incur the overhead of trying to display them.

Indexes should be added for the following columns:

EqtyTransactionsGeneral.StockID
EqtyTransactions_auxNeg.StockID
EqtyTransactions_auxPosit.StockID
EqtyTransactionsGeneral.TransDate

Also, any columns referenced in a WHERE clause will benefit from indexing.

Before you just add indexes to everything, though, you need to understand that indexes are not free. They speed up SELECT statements, but they slow down INSERTs, since new records have to be added to the index when they are created. So if these tables are bulk-updated, adding indexes will have a negative impact on that.
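For the columns listed above, the index creation could look like this (the index names here are just illustrative):

```sql
-- Nonclustered indexes on the join and sort columns
CREATE INDEX IX_General_StockID   ON dbo.EqtyTransactionsGeneral (StockID);
CREATE INDEX IX_AuxNeg_StockID    ON dbo.EqtyTransactions_auxNeg (StockID);
CREATE INDEX IX_AuxPosit_StockID  ON dbo.EqtyTransactions_auxPosit (StockID);
CREATE INDEX IX_General_TransDate ON dbo.EqtyTransactionsGeneral (TransDate);
```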

M@



 

Author Comment

by:fskilnik
ID: 17052872
Thanks for the very interesting and complete answer, M@ !

I will reconsider the whole structure to try to avoid the RIGHT OUTER JOIN and use a plain JOIN, for sure.


>> These appear to be transactional tables, which typically have a lot of rows.
Exactly.

>> Keep in mind that a view is a way of pulling data together to make it more usable and will typically have some criteria applied against it when a SELECT is run.

I am using it so I can "chain" views, I mean: I create a view, then I create another one "on top of it", calculating for instance the SUM of all values of a field that was created (by formula) in the previous view, and so on. Is this the proper way to do it? It seems to me that I am doing something analogous to querying queries in Access. Is "my way" the "normal" way to do that in SQL Server?
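For instance, what I am doing looks roughly like this (the table and column names here are just made up to show the idea):

```sql
-- First view computes a field by formula...
CREATE VIEW dbo.View1 AS
SELECT StockID, QtyShares * UnitPrice AS TransValue
FROM dbo.SomeTransactions;
GO

-- ...second view is built "on top of it" and SUMs that field
CREATE VIEW dbo.View2 AS
SELECT StockID, SUM(TransValue) AS TotalValue
FROM dbo.View1
GROUP BY StockID;
```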

>> A better way to test the view's speed is to do something like this:  SELECT COUNT(*) FROM MyView

Wow! Very nice, and I would not have imagined that most of the time goes into "showing" the answers, not calculating them!!

>> About indexes: thanks for pointing out the proper places to put them, and thanks for the warning. I will read about them and try to implement them accordingly!

(Our "workload" is much more about SELECTing things than INSERTing things... good! But I will try to make it as clean and clear as possible, for sure.)

Thanks a lot for your quick and thorough assistance, M@.

Fábio.



 
LVL 7

Expert Comment

by:ExpertAdmin
ID: 17053221
Yes, that is a decent use of the view. I think you are on the right track.

M@
 

Author Comment

by:fskilnik
ID: 17054163

Great, M@ !  

I will do the following: increase 300 to 400 points and split them between M@ ("Accepted", 300 pts) and Aneesh ("Assisted", 100 pts). I guess it's fair and I hope both of you like it.

Thanks a lot,
Fábio.
 
 

Author Comment

by:fskilnik
ID: 17054195

(I'm taking into account the fact that Aneesh helped me in another question after it was closed, so I could not reward him then.)
 
LVL 7

Expert Comment

by:ExpertAdmin
ID: 17054199
Sounds good. Good luck.

M@
 

Author Comment

by:fskilnik
ID: 17102090
Hi M@ !

Would you mind checking my "next phase database project question" below, please?

http://www.experts-exchange.com/Databases/Microsoft_SQL_Server/Q_21918495.html

Your expertise in the matter would be most welcome, if you have a little time to take a look!


 Thanks a lot,

 Fábio.
 

Question has a verified solution.
