Impact of restricting RAM usage of a process in Win 2008 R2 & Win2012 & how to test/verify for adverse impact

sunhux asked:
We have an AV process on Windows 2008 R2 / Win2012 which can chew up to 320MB of RAM, and
the principal told us this is expected/normal, especially when Realtime (i.e. On-Access) scan is enabled and it
covers folders containing many files, or highly compressed files.

On rare occasions, we'll see hard page fault counts of around 5-15.

Our tenants are not happy with this reply and want folders that contain a huge number of files, as
well as folders with compressed files, to be excluded from the Realtime scan.
Excluding those folders from Realtime scan means there's a risk those folders get infected with no
Realtime protection.  Would the alternative measure of restricting the process's RAM have any impact
on the Realtime scan, such as slower performance of the AV?
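For reference, the usual way to cap a single process's memory on Windows Server is a Job Object with the JOB_OBJECT_LIMIT_PROCESS_MEMORY flag. A minimal ctypes sketch follows (Windows-only; the 320 MB cap and the idea of applying it to the AV process are assumptions from this thread — whether the AV tolerates running inside a capped job is exactly what a lab test needs to establish):

```python
# Sketch (Windows-only): cap a process's committed memory with a Job Object.
# JOB_OBJECT_LIMIT_PROCESS_MEMORY is the documented limit flag; the 320 MB
# figure mirrors this thread and is an assumption, not a recommendation.
import ctypes
import sys
from ctypes import wintypes

JOB_OBJECT_LIMIT_PROCESS_MEMORY = 0x00000100
JobObjectExtendedLimitInformation = 9

def mb(n):
    # Megabytes -> bytes, the unit ProcessMemoryLimit expects.
    return n * 1024 * 1024

if sys.platform == "win32":
    class IO_COUNTERS(ctypes.Structure):
        _fields_ = [(name, ctypes.c_ulonglong) for name in (
            "ReadOperationCount", "WriteOperationCount", "OtherOperationCount",
            "ReadTransferCount", "WriteTransferCount", "OtherTransferCount")]

    class JOBOBJECT_BASIC_LIMIT_INFORMATION(ctypes.Structure):
        _fields_ = [
            ("PerProcessUserTimeLimit", wintypes.LARGE_INTEGER),
            ("PerJobUserTimeLimit", wintypes.LARGE_INTEGER),
            ("LimitFlags", wintypes.DWORD),
            ("MinimumWorkingSetSize", ctypes.c_size_t),
            ("MaximumWorkingSetSize", ctypes.c_size_t),
            ("ActiveProcessLimit", wintypes.DWORD),
            ("Affinity", ctypes.c_size_t),
            ("PriorityClass", wintypes.DWORD),
            ("SchedulingClass", wintypes.DWORD)]

    class JOBOBJECT_EXTENDED_LIMIT_INFORMATION(ctypes.Structure):
        _fields_ = [
            ("BasicLimitInformation", JOBOBJECT_BASIC_LIMIT_INFORMATION),
            ("IoInfo", IO_COUNTERS),
            ("ProcessMemoryLimit", ctypes.c_size_t),
            ("JobMemoryLimit", ctypes.c_size_t),
            ("PeakProcessMemoryUsed", ctypes.c_size_t),
            ("PeakJobMemoryUsed", ctypes.c_size_t)]

    k32 = ctypes.windll.kernel32
    job = k32.CreateJobObjectW(None, None)
    info = JOBOBJECT_EXTENDED_LIMIT_INFORMATION()
    info.BasicLimitInformation.LimitFlags = JOB_OBJECT_LIMIT_PROCESS_MEMORY
    info.ProcessMemoryLimit = mb(320)
    k32.SetInformationJobObject(job, JobObjectExtendedLimitInformation,
                                ctypes.byref(info), ctypes.sizeof(info))
    # AssignProcessToJobObject(job, process_handle) would then apply the cap.
```

Note that a service started by the Service Control Manager would need the job applied at launch, which is another reason this belongs in a lab first.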

How can we test this?
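Before changing anything, it helps to baseline what the process actually does with memory over time. A sketch in Python, assuming working-set samples have been exported (e.g. from PerfMon's \Process(...)\Working Set counter) to a list of MB values; the sample numbers below are made up:

```python
# Sketch: summarize working-set samples (in MB) to see whether the ~320 MB
# figure is a sustained plateau or an occasional spike. The sample data
# below is illustrative, not a real measurement.
from statistics import mean

def summarize_working_set(samples_mb):
    """Return mean, p95 and max of a list of working-set samples in MB."""
    ordered = sorted(samples_mb)
    p95 = ordered[int(0.95 * (len(ordered) - 1))]
    return {"mean": mean(ordered), "p95": p95, "max": ordered[-1]}

samples = [180, 185, 188, 190, 190, 195, 200, 205, 210, 320]  # hypothetical
print(summarize_working_set(samples))
```

If the p95 sits far below the maximum, the 320MB is a transient spike (likely during a scan of a compressed archive) rather than a steady cost, which changes what a RAM cap would actually do to the process.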
It's really hard to predict what would happen without knowing which AV program you are dealing with. I know there are some out there that are real memory and real-time hogs; on servers, I tend to avoid those programs.  Unless you are running RDP on the server, simply check each file that is copied or created on the server, check all the files during idle times, and check each program that is run on the server.  You don't need personal firewalls via your AV program; that's why you have built-in firewalls and routers.  Leave the "take over full control of this workstation because the user doesn't have brains enough to use it safely, and in the process bring the workstation to its knees" behaviour to the home computers where the children have unrestricted access to everything.

We have developed standard business employment standards with the clients whose IT we handle.

Advise all existing employees and new hires of the proper use of the Business's Computers.
No personal use is permitted without permission of a manager. Anyone bringing in any type of media from outside that adversely affects the network will be terminated with cause, first offense, no warnings. The same holds true of employees whose actions cause downtime on the network because they intentionally ignored the instructions regarding the proper use of company resources.

All employees must sign a statement that they have read and understand the rules.  It only takes one or two people being terminated to convince the rest that their employers are serious.


It's not desktops/laptops that we are running the AV software on, but our
tenants' VMs that are hosted in our Cloud.

It's the Deep Security CoreShellService process that I'm referring to.


Those VMs are Win2008 R2 and Win2012 servers with RAM ranging
from 2GB to 16GB.  To some of the tenants, taking up 320MB is a
considerable amount.


Still need an impact assessment of restricting the RAM on CoreShellService, and how do we test/verify its impact?
Top Expert 2016
It really depends upon the program itself.  Some will continue to work but with reduced efficiency; others will simply fail silently, or you may blue-screen.

One does a cost/benefit and risk assessment, perhaps choosing a program that requires fewer resources. In many cases Windows Defender is good enough. One can simply disallow execution of programs on user shares using NTFS file permissions.


I can't use Windows Defender as it's not endorsed by our corporation.

Thing is, I have not seen whether Win Defender truly uses much less RAM.

How do we verify that efficiency is reduced by the RAM restriction?
Top Expert 2016

This can only be done by testing the application in a lab environment.
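For the lab test itself, one workable shape: run an identical on-access workload (copying a fixed tree with many small files plus some compressed archives) with and without the RAM restriction, and compare timings against a threshold agreed with the tenants. A minimal sketch with placeholder numbers:

```python
# Sketch of the lab comparison: time the same on-access workload twice --
# once unrestricted, once with the RAM cap applied. The numbers below are
# placeholders; real timings come from your lab runs.
def slowdown_pct(baseline_secs, restricted_secs):
    """Percentage slowdown of the restricted run over the baseline."""
    return (restricted_secs - baseline_secs) / baseline_secs * 100.0

def regressed(baseline_secs, restricted_secs, threshold_pct=10.0):
    """True if the RAM cap slows the workload beyond the agreed threshold."""
    return slowdown_pct(baseline_secs, restricted_secs) > threshold_pct

print(slowdown_pct(120.0, 150.0))  # 25.0 -> 25% slower with the cap
print(regressed(120.0, 150.0))     # True -> cap fails the acceptance test
```

Alongside the timing, each run should also confirm the AV still detects a test file (e.g. an EICAR sample) under the cap, since silent failure is one of the risks mentioned above.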
Exec Consultant
Distinguished Expert 2018
If the intent is to ensure the AV does its work and critical files are scanned on access, throttling the RAM resources seems to "starve" the AV process instead, making it ineffective for a comprehensive scan. So there is still a risk that files are not fully scanned, particularly when there is concurrent access to many such files, as in the case of a scheduled batch job for backup, or a transfer to a network drive for syncing.

So I think a risk-measured approach needs to be balanced out: make sure your users understand the exposure, since throttling or disabling leaves the AV blind, which defeats the purpose of the safeguard. Either swap the AV, or target the on-access use cases that are the likely root cause of the page faults. There is a case for whitelisting files that are big and do not normally change, like video or duplicated old archives that have not been touched for a while (and should be archived and removed to a backup store), as opposed to dynamic, working subfolders and documents.

Re-assess the storage of those files instead of always targeting the AV, which can only give limited benefit by starving its process. If the AV isn't good or optimal, then why bother having it in the first place? Maybe run the guests virtually and have a hypervisor-level AV appliance do the scan instead.

Use PerfMon with counters from the "Process", "Memory" and "Processor" performance objects; you can then watch these counters in real time. Is the usage really always so high, and are these all symptoms leading to the page faults? There may be other processes running too. Process Explorer is another useful tool for drilling into the loaded DLLs. But if you still wish to explore "trimming" the memory, Process Lasso may be a candidate - see:
Perhaps the biggest reason we object to these 'optimizers' is that the implementation of every one we've evaluated has been atrocious. Many allocate as much memory as they can, until they are finally denied memory. Then they release it. Thus 'cleaning' your virtual memory.

Others use the correct APIs to clear the working set. However, they do so for ALL running processes, regardless of whether the process has any excessive RAM use or not. That immediately causes those processes to page back in the memory that was being actively used (page faults). Every time they do this your PC has to essentially pause and recover from this brutish operation!

Bitsum's SmartTrim, part of Process Lasso, is an effort to mitigate the negative effects of improperly written RAM optimizers & imperfect RAM usage, while giving users the control they desire.
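The pattern described above (a forced working-set trim immediately followed by a burst of hard page faults) can be spotted directly in counter data. A sketch, assuming paired per-interval samples of working set (MB) and hard faults/sec collected from PerfMon; the thresholds and data are illustrative:

```python
# Sketch: flag sample intervals where a sharp working-set drop (a forced
# trim) coincides with a hard-page-fault spike -- the pattern the quote
# above warns about. Thresholds and sample data are illustrative.
def trim_fault_events(working_set_mb, faults_per_sec,
                      drop_mb=50, fault_spike=10):
    """Return indices i where the working set dropped by more than drop_mb
    between samples i-1 and i while faults/sec exceeded fault_spike."""
    events = []
    for i in range(1, len(working_set_mb)):
        dropped = working_set_mb[i - 1] - working_set_mb[i] > drop_mb
        if dropped and faults_per_sec[i] > fault_spike:
            events.append(i)
    return events

ws = [300, 310, 120, 150, 160]    # hypothetical working-set samples (MB)
pf = [2, 3, 45, 12, 4]            # hypothetical hard faults/sec
print(trim_fault_events(ws, pf))  # [2]: the trim at sample 2 caused faults
```

If events like this show up after applying a RAM cap or a trimming tool, the restriction is trading memory for I/O, which on a shared cloud host may be a worse deal for the tenants than the 320MB itself.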
