jmarbutt

asked on

Slow Performing ASP.net 2.0 Application

We have about 50 sites running the same ASP.NET 2.0 application under different IIS web sites, sharing the same application pool and folder.

When we upload a new version or recycle IIS, all the sites perform very, very slowly. We have highly optimized the sites' SQL calls and every other area, but since we upgraded from 1.1 to 2.0 we have noticed this problem. Once a site has run for the first time it seems fine for a while, but it can take 5-30 minutes for a site to get up to speed. We do not see this problem on our development machine.

So my question is:

1. Should we precompile?
2. If we precompile, should we do something special for every site so that the IIS metabase data is correct?
3. Is there a problem because they share the same application folder?
4. Is there a problem because they share the same application pool?

Thanks
surajguptha

Have you tried monitoring CPU utilization and memory usage when you notice this slowness? Maybe there are just too many calls being serviced?
jmarbutt

ASKER

CPU stays pretty much around 30%, and memory usage is about 1.28 GB out of 2 GB.

This didn't start happening until we upgraded to ASP.NET 2.0.
You mean you just switched the web app that was running under 1.1 over to 2.0 in the ASP.NET settings of the virtual directory?
No, I just upgraded our code from 1.1 to 2.0.
ASKER CERTIFIED SOLUTION
McExp

Do you see this problem for the same page, or is performance slow only on the first access of a page? You seem to have noticed this only after you moved the code to 2.0, but there is nothing in 2.0 specifically that will make a site run slow, and nothing specific you need to do in 2.0 to make a site run faster. There must be some other issue. Did you check the event viewer for any warnings or errors?

Rejo
The first request on every site is very slow. If they are all using the same application, and web site 1 compiles after its first access, why does web site 2, which points to the same folder and application pool, take so long to start?
>>So if they are all using the same application and web site 1 compiles after first access why does web site 2 which is in the same folder and application pool
What do you mean by "web site 2 which is in the same folder"? Do you mean "web page"? The first request to a web page in any application will be slower than subsequent requests to the same page.

Rejo
Ok, let's take a couple of steps back. Here is how our application is set up:

1. One folder at e:\web\
2. Only one application in there
3. Approximately 50 IIS sites pointing to the folder e:\web
4. One application pool for low-profile sites, and one for each higher-profile site (only about two on this server)


This is a CMS application that we developed in house; it generates the sites based on their HTTP headers, and all the sites point to the same database. Most of the sites are low traffic and were never a problem until we converted to ASP.NET 2.0.

Let me know if you have any other questions
I might be wrong here, but my understanding is that even if you have multiple applications pointing to the same folder, each one set up as a different application/virtual directory will have its own "context" and so will be considered a different application. If this were not the case, the Application and Cache objects would be shared between two applications. You can test this by adding a small piece of code, such as a counter of users or requests, in the Application_OnStart event in global.asax, and seeing whether it fires again every time you access a different site. If it does, my assumption is wrong.

Rejo
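(Editorially, the test Rejo describes could be sketched roughly like this in Global.asax; the log path is illustrative. Application_Start fires once per application domain, so one log line per site's first hit would confirm the sites really are separate applications despite sharing a folder.)

```vb
' Global.asax - minimal sketch of Rejo's suggestion (log path illustrative).
Sub Application_Start(ByVal sender As Object, ByVal e As EventArgs)
    ' One line is appended each time an application domain starts up.
    IO.File.AppendAllText("e:\web\appstarts.log", _
        DateTime.Now.ToString() & " - started for host " & _
        HttpContext.Current.Request.Url.Host & Environment.NewLine)
End Sub
```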
I believe you are correct, but this gets back to the precompilation question: if we precompile, do we have to do something special for every site?

I created a test.aspx in our application that does nothing but a Response.Write("test"). If the individual web site has not been accessed yet, it can take about 5 minutes to compile; and once it has compiled for that site, going to another site takes the same amount of time and basically recompiles for each site.

I feel it might be best to separate each site, because it seems the sites are fighting with each other to compile. This creates more work for us, but hopefully it will make things run smoother. I just wanted to confirm with someone before doing this, or see if anyone else has run into something like this.
I am not sure what you mean by precompiling. I assume you have deployed a compiled web site (I mean the DLLs, not the .vb or .cs files), and what we are discussing here is JIT compilation.
What are the application pool properties set to? I mean, on the Performance tab of the application pool properties, what is the idle timeout? Is it too low? Are you recycling very often? Have you changed the defaults on the Recycling tab?

Rejo
Here are the settings from the application pool:

Recycling:
Recycle worker process (minutes): 360
Recycle worker process at the following times: 00:00
Maximum used memory (MB): 500

Performance:
Idle timeout (minutes): 20
Request queue limit: 4000
Maximum number of worker processes: 1

There seems to be nothing here that would be increasing the frequency of worker process recycles. So, back to precompilation: what do you mean by that? I hope you have already seen this: http://www.odetocode.com/Articles/417.aspx

Rejo
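(For reference, the article Rejo links to covers precompilation for deployment, which is done with the aspnet_compiler tool that ships with .NET 2.0; the virtual path and folders below are illustrative:)

```
rem Precompile for deployment: -v is the virtual path, -p the physical
rem source folder, and the last argument the target output folder.
aspnet_compiler -v /CoolWave -p e:\web C:\deploy\CoolWave

rem Add -u to keep the .aspx markup files updatable after deployment.
aspnet_compiler -v /CoolWave -p e:\web -u C:\deploy\CoolWave
```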
I tried that, but I am getting:

"This is a marker file generated by the precompilation tool, and should not be deleted!"

when we try to access pages via our HTTP handler. Our CMS actually takes a file name like Home.aspx, looks it up in the database, and shows it; that file doesn't physically exist. So how do we allow it to handle that?
Where are you getting that error?
>>We actually have our cms to take a file name like Home.aspx and look it up in their database to show it and that file doesn't actually exist.
The .aspx pages do exist. I think it would make sense to do some reading on what exactly needs to be published, i.e. the .aspx pages, the DLLs (everything within the bin folder), web.config, etc., and no .vb or .cs files. Can you please confirm what types of pages you have within the folder?

Rejo
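(For context on the marker-file message quoted above: a non-updatable precompiled deployment replaces each .aspx with a one-line placeholder containing exactly that text, and drops a PrecompiledApp.config file in the application root, roughly:)

```xml
<precompiledApp version="2" updatable="false"/>
```

With the -u (updatable) option, the .aspx markup files are kept intact instead, which matters for any code that parses physical .aspx files at runtime.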
SOLUTION
Yes, I know the files are correct. We have an httpHandler defined in the web.config:

    <httpHandlers>
      <add verb="*" path="*.aspx" type="CoolWave.HTTPHandlers.CoolWaveHTTPHandler,CoolWave"/>
    </httpHandlers>



This means that if someone requests a page like Home.aspx, which doesn't physically exist, it will be pulled from the database. Here is the GetHandler function of the HttpHandler factory:

    Public Function GetHandler(ByVal context As System.Web.HttpContext, ByVal requestType As String, _
            ByVal url As String, ByVal pathTranslated As String) As System.Web.IHttpHandler _
            Implements System.Web.IHttpHandlerFactory.GetHandler

        ' If the requested file physically exists, serve it directly.
        If IO.File.Exists(context.Server.MapPath(url)) Then
            Return PageParser.GetCompiledPageInstance(url, context.Server.MapPath(url), context)
        End If

        If context.Request.QueryString("Page") = "" Then
            ' Strip "www." so the host matches the SiteDomain table.
            Dim BaseURL As String = context.Request.ServerVariables("HTTP_HOST").Replace("www.", "")

            ' Reduce "/app/Home.aspx" to the bare page title "home".
            Dim PageTitle As String = LCase(url).Replace(LCase(context.Request.ApplicationPath), "") _
                .Replace("/", "").Replace(".aspx", "")

            ' Look up the requested page for this domain.
            Dim DT As DataTable = GSQL.SQLSelect("SitePage", _
                "SELECT SiteDomain.URL, SitePage.SitePage_GUID FROM SitePage " & _
                "LEFT OUTER JOIN SiteDomain ON SitePage.Site_GUID = SiteDomain.Site_GUID " & _
                "WHERE SitePage.Title like @PageTitle AND SiteDomain.URL like @URL", _
                New SqlClient.SqlParameter("@PageTitle", PageTitle), _
                New SqlClient.SqlParameter("@URL", BaseURL)).Tables(0)

            If DT.Rows.Count > 0 Then
                ' Found: stash the page GUID for default.aspx to render.
                context.Items.Add("Page", DT.Rows(0)("SitePage_GUID").ToString)
                Return PageParser.GetCompiledPageInstance(url, context.Server.MapPath("default.aspx"), context)
            Else
                ' Not found: fall through to default.aspx with no page set.
                Return PageParser.GetCompiledPageInstance(url, context.Server.MapPath("default.aspx"), context)
            End If
        Else
            ' Explicit ?Page= on the query string: use it directly.
            context.Items.Add("Page", context.Request.QueryString("Page"))
            Return PageParser.GetCompiledPageInstance(url, context.Server.MapPath("default.aspx"), context)
        End If
    End Function
I have never used the PageParser.GetCompiledPageInstance method, so I might not be correct here, but your code seems to be getting a value from the database when the page does not exist and serving default.aspx in its place.
>> If DT.Rows.Count > 0 Then

                    'context.Response.Redirect("default.aspx?Page=" & DT.Rows(0)("SitePage_GUID").ToString)
                    context.Items.Add("Page", DT.Rows(0)("SitePage_GUID").ToString)
                    Return PageParser.GetCompiledPageInstance(url, context.Server.MapPath("default.aspx"), context)
                Else
                    Return PageParser.GetCompiledPageInstance(url, context.Server.MapPath("default.aspx"), context)
                End If

I assumed this because both the "If" and "Else" branches use default.aspx.

But coming back to your question: what happens if the page is not there? If home.aspx is supposed to be there, the deployment will have copied it; if default.aspx is supposed to be there, it will be there. So when home.aspx is requested and is not physically present, your code displays default.aspx. Is that not what you want? And if it is, how would the deployment process change that logic? The code should work regardless of the deployment approach you have taken. Maybe I am totally confused and way off on your requirement.

Are you trying to do URL rewriting? Something like this? http://codebetter.com/blogs/jeffrey.palermo/archive/2005/08/10/130532.aspx

Rejo
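(The URL-rewriting approach in Rejo's link is typically done with HttpContext.RewritePath in an HttpModule rather than a handler factory; a hedged VB sketch, with all names illustrative, might look like this. It maps a virtual page such as /Home.aspx onto the real default.aspx instead of compiling a page instance per request.)

```vb
' Minimal URL-rewriting sketch using an HttpModule (names illustrative).
Public Class RewriteModule
    Implements System.Web.IHttpModule

    Public Sub Init(ByVal app As System.Web.HttpApplication) _
            Implements System.Web.IHttpModule.Init
        AddHandler app.BeginRequest, AddressOf OnBeginRequest
    End Sub

    Private Sub OnBeginRequest(ByVal sender As Object, ByVal e As EventArgs)
        Dim app As System.Web.HttpApplication = CType(sender, System.Web.HttpApplication)
        ' If the requested file does not physically exist, rewrite to
        ' default.aspx and carry the original path in the query string.
        If Not IO.File.Exists(app.Request.PhysicalPath) Then
            app.Context.RewritePath("~/default.aspx?Page=" & app.Request.Path)
        End If
    End Sub

    Public Sub Dispose() Implements System.Web.IHttpModule.Dispose
    End Sub
End Class
```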
The problem seems to be in PageParser.GetCompiledPageInstance, since the instance of the page is compiled into the DLL.
I recompiled with the site set to updatable, and that seems to have fixed the GetCompiledPageInstance problem, but the site is still very, very sluggish: it is taking about 20 seconds per page to load.
Do give this a try; again, some trial and error. Replace context with HttpContext.Current in all the places:

Return PageParser.GetCompiledPageInstance(url, context.Server.MapPath("default.aspx"), HttpContext.Current)

This just ensures it is definitely looking at the current context and so not recompiling on every access.

Rejo
Ok, I have made that change, and I have actually set up two sites: one running precompiled and one not.

The precompiled site is at:
http://pre.waycoolsw.com

and the other is our production site:
http://www.waycoolsw.com

The precompiled site still seems very sluggish.
The initial responses for the first few pages are just as slow as before, but it eventually catches up and passes the non-precompiled version. I am doing a precompilation for deployment on my development machine and FTPing it up. Is initializing the IIS metabase what is taking so long? Should I do an in-place precompilation to get rid of this initial delay?
It shouldn't matter, but you can give that a try.
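(For reference, in-place precompilation is the same compiler run against the live application with no target directory, so nothing needs to be copied afterward; the virtual path is illustrative:)

```
rem In-place precompilation of the running site (virtual path illustrative).
aspnet_compiler -v /CoolWave
```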
After a lot of work, we determined there were several problems with our server and our router that compounded the performance problems. But precompilation did bring significant performance improvements.