Just looking for some feedback on virtualised memory vs. dedicated physical RAM.
Here's the situation: we have an old .NET system running on a physical server with dedicated RAM, and it performs great. The new system runs in a VM cluster environment; on paper it has far more RAM and processors than the old box, but that's only a virtual allocation on a host shared with 40-odd other VMs.
We've noticed that some calls to the SQL Server 2012 database take an extra second or two. Could this be down to the memory being virtualised?
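Before blaming the virtualised memory, it might be worth timing the query on its own to see whether the extra second is spent inside SQL Server or in the layers in between. A minimal C# sketch (the connection string and query are placeholders, not from the real system):

```csharp
using System;
using System.Data.SqlClient;
using System.Diagnostics;

class QueryTimer
{
    static void Main()
    {
        // Placeholder connection string and query -- substitute your own.
        const string connectionString =
            "Server=myServer;Database=myDb;Integrated Security=true;";
        const string query = "SELECT COUNT(*) FROM dbo.SomeTable;";

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // Time only the query execution, excluding connection setup.
            var sw = Stopwatch.StartNew();
            using (var command = new SqlCommand(query, connection))
            {
                command.ExecuteScalar();
            }
            sw.Stop();

            // If this number is small but the end-to-end call is slow,
            // the delay is in the layers above the DB, not the DB itself.
            Console.WriteLine($"Query took {sw.ElapsedMilliseconds} ms");
        }
    }
}
```

If the raw query time matches the old system, the extra second or two is more likely the network hops and serialisation between the new layers than memory pressure on the VM host.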
In the old system, all the DB calls were made from the front end, whereas the new system is split into layers to take the heavy load off the app's front end.
So do you think the new system would perform much better running on a server with dedicated physical RAM?
Appreciate all opinions.