BNBJ

asked on

SQL 2000 Memory Leak?

I'm running a copy of SQL Server 2000 SP3 on Windows NT4 SP6a on a dual 800 MHz machine with 1 GB RAM.

I'm running a busy web application which uses ASP to talk to a VB COM object, which in turn connects to SQL. It handles relatively large amounts of data and can be quite processor intensive. The main problem I have seems to be a memory leak. When the application is running some of its larger methods you can watch the memory go up in Task Manager on the server. When the method finishes, the memory doesn't seem to be released. A single call to the COM object can consume 300 to 400 MB of memory. I've triple-checked the ASP and the COM object for non-destroyed objects and am 100% sure that everything is being cleaned up correctly.

To test SQL I have written a DTS package in VBScript to transfer data from one database to another on the same server.  I have intentionally made the row size quite big to simulate the functionality in the web app.  The field datatypes are:

int (4)
uniqueidentifier (16)
int(4)
nvarchar (1024)
nvarchar (1024)
nvarchar (1024)
nvarchar (128)
nvarchar (1024)
nvarchar (1024)
bit (1)

The table has well over 100,000 records, so I've set the DTS package to process them in batches of 1,000 at a time to test the memory.
The DTS uses an ActiveX Script containing 4 ADO objects (2 connections and 2 recordsets).  These are created and opened at the start and closed and destroyed at the end.
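
Roughly, the ActiveX Script looks like this (the connection strings, table name and field-copy loop are simplified placeholders rather than the exact production code):

' Sketch of the batched copy: two connections, two recordsets,
' created and opened at the start, closed and destroyed at the end.
Function Main()
    Dim cnSrc, cnDst, rsSrc, rsDst, fld, i

    Set cnSrc = CreateObject("ADODB.Connection")
    Set cnDst = CreateObject("ADODB.Connection")
    cnSrc.Open "Provider=SQLOLEDB;Data Source=(local);Initial Catalog=SourceDB;Integrated Security=SSPI"
    cnDst.Open "Provider=SQLOLEDB;Data Source=(local);Initial Catalog=DestDB;Integrated Security=SSPI"

    Set rsSrc = CreateObject("ADODB.Recordset")
    Set rsDst = CreateObject("ADODB.Recordset")
    rsSrc.Open "SELECT * FROM dbo.BigTable", cnSrc, 0, 1              ' forward-only, read-only
    rsDst.CursorLocation = 3                                          ' adUseClient
    rsDst.Open "SELECT * FROM dbo.BigTable WHERE 1 = 0", cnDst, 1, 4  ' keyset, batch optimistic

    i = 0
    Do Until rsSrc.EOF
        rsDst.AddNew
        For Each fld In rsSrc.Fields
            rsDst.Fields(fld.Name).Value = fld.Value
        Next
        i = i + 1
        If i Mod 1000 = 0 Then rsDst.UpdateBatch    ' send 1,000 rows at a time
        rsSrc.MoveNext
    Loop
    rsDst.UpdateBatch                               ' flush the final partial batch

    rsSrc.Close : rsDst.Close
    cnSrc.Close : cnDst.Close
    Set rsSrc = Nothing : Set rsDst = Nothing
    Set cnSrc = Nothing : Set cnDst = Nothing

    Main = DTSTaskExecResult_Success
End Function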

Each batch of 1000 records transferred takes about 13MB more memory!?  This memory never seems to be released from SQL so it eventually runs out and paging starts.

Does anyone know how I can force this memory to be deallocated, or what I might be doing wrong?

Looking forward to your replies!

jSkipB

Perhaps this may be worthwhile:

You may notice that, after the very first couple of iterations, the problem doesn't seem to be there, but rather occurs after, say, the first 3-5K records have been processed.

If it doesn't cause integrity problems, after each 1000 rows, commit any pending transactions, then give SQL a chance to "calm down" (you may even have to close and reopen your data connection). This applies especially if implicit transactions are turned on.
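
In terms of the kind of script sketched above, that suggestion would look something like this inside the copy loop (sDstConnString is just a stand-in for whatever connection string the destination connection was opened with):

' Sketch: every 1,000 rows flush the batch, commit anything still pending,
' and drop/reopen the destination connection so SQL Server gets a chance
' to release what it is holding for that batch.
If i Mod 1000 = 0 Then
    rsDst.UpdateBatch                                   ' push the pending inserts
    cnDst.Execute "IF @@TRANCOUNT > 0 COMMIT TRAN"      ' commit if implicit transactions left one open
    rsDst.Close
    cnDst.Close                                         ' let SQL "calm down"
    cnDst.Open sDstConnString
    rsDst.Open "SELECT * FROM dbo.BigTable WHERE 1 = 0", cnDst, 1, 4
End If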
ASKER CERTIFIED SOLUTION

TheSpirit
TheSpirit is right with his post.  I assume you're not using AWE and you have SQL Server configured to dynamically allocate memory?

Brett
I have a question for you...

What if I want to let MSSQL use as much memory as it wants, but I would like to get it back when the peak is done...

I have the same behavior as BNBJ and I occasionally turn my server off and on to free up some memory.

By the way, I had a server on a computer with 256 MB of RAM and MSSQL was always taking as much as it could... when we upgraded to a computer with 1.8 GB of RAM, MSSQL did the same thing, taking nearly 1.7 GB of memory for itself...

BNBJ

ASKER

Thanks TheSpirit.  Not the answer I wanted but if that's the way it is then fair enough.

If I want to isolate the web app's database with no detrimental effect on the other databases on the webserver, would I be best setting up a 2nd instance of SQL and putting the web app database in there?  I could then tell each instance how much memory it can have.

The reason I say this is that as the server slows down and the memory use climbs, I tend to get a problem between ASP and SQL. I either get an 'HTTP 1.1' message or '[DBNETLIB] SQL Server does not exist or access denied'. This then means I need to reboot.

Would controlling the memory usage within SQL help with this and reduce the problem?

Also, what's the SP for limiting memory?

Thanks everyone.
Avatar of BNBJ

ASKER

I've just read in Books Online the following paragraph...

"There is a short delay between the start of a new application and the time SQL Server releases memory. Using max server memory prevents this delay and may give better performance to the other application. Only set min server memory if the start time of new applications sharing the same server as SQL Server shows up as a problem. It is better to let SQL Server use all of the available memory."

Do you reckon that setting the max server memory would then prevent the '[DBNETLIB] SQL Server does not exist or access denied' message, as the remaining memory could be accessed more quickly by IIS?
Setting the max memory would probably help you since there would be no delay waiting for SQL to give up the memory it took.
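
For what it's worth, the setting in question is sp_configure's 'max server memory (MB)' option ('show advanced options' has to be switched on before it appears). From VBScript/ADO, something along these lines should do it; the connection string and the 512 MB figure are just placeholders, so pick a cap that leaves enough for NT, IIS and the COM objects:

' Sketch: cap SQL Server 2000's memory via sp_configure.
' 'max server memory (MB)' is an advanced option, so expose it first.
Dim cn
Set cn = CreateObject("ADODB.Connection")
cn.Open "Provider=SQLOLEDB;Data Source=(local);Initial Catalog=master;Integrated Security=SSPI"
cn.Execute "EXEC sp_configure 'show advanced options', 1"
cn.Execute "RECONFIGURE"
cn.Execute "EXEC sp_configure 'max server memory (MB)', 512"   ' cap the buffer pool at 512 MB
cn.Execute "RECONFIGURE"
cn.Close
Set cn = Nothing

The same sp_configure/RECONFIGURE calls can of course be run directly in Query Analyzer, or the value can be set on the Memory tab of the server's properties in Enterprise Manager.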
BNBJ

ASKER

What about setting up 2 instances of SQL?

Any benefit or would it just be an increased overhead?
Setting up a second instance takes more resources on the server, since you're essentially running two SQL Server processes. I've used them before, but I haven't really seen anything good from them (you're supposed to be able to keep one instance from affecting the other, but I've had an instance lock up the whole server, and all you could do was stop and start ALL the SQL Server instances)...

Brett
BNBJ

ASKER

Thanks everyone for your comments.

TheSpirit,

Such a simple solution in the end. I've set the max server memory and kept just the one instance, and so far everything seems fine.

Thanks