This is an urgent item.
We have 3 servers:
- Production Server
- Qualification Server
- Developmental Server
We have a web application installed on all three servers with identical Web.config settings; the only difference is the connection string to the SQL Server instance. All three systems are connected to identical copies of the database, so the data loaded is the same.
My issue is that the Production system is using roughly 5 times the RAM of the other two systems, and this usage shows up as soon as the application starts:
- Production Server RAM usage at startup is 19.4 GB
- Qualification Server RAM usage at startup is 4 GB
- Developmental Server RAM usage at startup is 4.2 GB
I have gone through the settings to compare the application pools and I do not see any differences.
I am looking for a setting that might have been accidentally enabled on the Production system and is causing this high RAM usage.
The Production system cannot be taken down over and over, so I have to be careful with any edits I make to fix the problem; for now I am mainly trying to reproduce the high memory usage in the Qualification environment.
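To watch the worker process memory without touching the application itself, I was planning to sample `tasklist` output on each server, roughly like this (Windows-only; the parsing assumes `tasklist /fo csv` output, where "Mem Usage" is the working set in K with thousands separators):

```python
import csv
import io
import subprocess

def parse_tasklist_csv(csv_text):
    """Parse `tasklist /fo csv` output into [(image_name, pid, mem_kb)]."""
    rows = []
    for row in csv.reader(io.StringIO(csv_text)):
        if not row or row[0] == "Image Name":
            continue  # skip the header row
        name, pid, _session, _session_num, mem = row[:5]
        # "Mem Usage" looks like "19,456 K"; locale differences may vary this
        mem_kb = int(mem.replace(",", "").replace("K", "").strip())
        rows.append((name, int(pid), mem_kb))
    return rows

def w3wp_memory_kb():
    """Sample current w3wp.exe working sets (Windows only)."""
    out = subprocess.run(
        ["tasklist", "/fi", "imagename eq w3wp.exe", "/fo", "csv"],
        capture_output=True, text=True, check=True).stdout
    return parse_tasklist_csv(out)
```

The idea is to log this every minute on Qualification while I change settings one at a time, and see if anything makes it jump toward the Production numbers.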
This is urgent, so any help is appreciated.