Guideline for zipped & non-zip files to be excluded from scanning

Posted on 2014-10-23
Last Modified: 2014-10-28
Trend's Deep Security has given us a list of files (databases, certain SharePoint files, etc.) to be excluded from
their AV scan, as they will cause issues.

Trend's support has advised that zip files (especially those with many files zipped into one archive) take
more resources to unzip and scan, and we were told Deep Security will re-zip them afterwards.  They advised
that malware/viruses usually tend to infect smaller files and rarely infect big files, but did not give
specific file sizes, nor the zip and non-zip file sizes that would trigger system performance issues.

Does anyone, or any other AV product, have a white paper / guidelines on:
a) the file size above which malware generally won't infect
b) the sizes (and number of files in a zip) above which it's recommended
    not to scan, so as not to affect performance.  In particular, I have
    customers/tenants that use our facilities and publish zip and non-zip
    files, and I would say it's fair if it takes 30 seconds to scan a published
    file; beyond that, the user will get unhappy.

We run DS on-demand and real-time scans on Windows 2008 R2,
RHEL 5.x/6.x (real-time only) & Solaris x86 (on-demand).
Question by:sunhux

Assisted Solution

Merete earned 100 total points
ID: 40401203
Hi, personal opinion: I don't think this is possible to pin down, since malware authors constantly evolve/recode their scripts, hence the reason we have updates to our virus detection definitions;
if a weakness can be found, they'll find it no matter the size.
a) what's the file size above which malware generally won't infect << no such thing
b) at what sizes (and number of files in a zip) it's recommended not to scan: well, Trend has recommendations for Compressed File Handling; see the compressed-file scanning restrictions:

Accepted Solution

btan earned 400 total points
ID: 40401910
I don't suggest going by file size to decide what to scan; for performance's sake, based on the AV, I consider their defaults to be the best practice specific to their processing. There is no so-called universal best practice, as it is specific to the AV used.

E.g. Symantec has a maximum file size of 2 GB per file, and if the file is a container file like a ZIP etc., the unzipped size (including zip-in-zip) must not exceed 30 GB (30719 MB).

But really, do we even process files above 5 MB, which is already considered quite huge? I know email attachments can go bigger than this, but we do that only rarely, and such cases usually call for file transfer instead. I would say the AV should not have a limit, but as mentioned earlier, take their recommended defaults and tune as needed to your hardware and throughput. I do not see a magic number appearing immediately, but minimally you have the granularity to customise the sizing (this also complicates administration, so go simple: have the biggest consumer in the organisation test with their normal biggest file load).

Nonetheless, you can check out the practice guide from TM

1 - (pdf) DSR page 34 states the recommended actions and settings.
(Recommended Real-time Scan Configuration)
Maximum size of individual extracted files - 30 MB
Maximum Levels - 2
Maximum number of files to extract - 10
OLE Layers to Scan - 3

(Recommended Scheduled Scan Configuration)
Maximum size of individual extracted files - 60 MB
Maximum Levels - 3
Maximum number of files to extract - 10
OLE Layers to Scan - 3

(Recommended Manual Scan Configuration)
Maximum size of individual extracted files - 60 MB
Maximum Levels - 2
Maximum number of files to extract - 10
OLE Layers to Scan - 3

2 - (pdf)
See the "Advanced Options (Scan Restriction Criteria)" for:
- Decompressed file count exceeds (default: 9999)
- Size of decompressed file exceeds (default: 100 MB)
- Number of layers of compression exceeds (default: 5)
- Size of decompressed file is "x" times the size of compressed file (default: 1000)

Also note that scanning a compressed file can expose you to a Denial-of-Service (DoS) attack, which happens when a server's resources are overwhelmed by unnecessary tasks. So it is wise to prevent the AV from scanning files that decompress into very large files. Hence those fields do play a part; just profile and tune as you go along...
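Those restriction criteria can also be pre-screened outside the AV itself, which is handy for a publishing pipeline. A minimal Python sketch, assuming the default thresholds quoted above; the function name and the idea of a standalone pre-filter are my own illustration, not part of Deep Security's API:

```python
import zipfile

# Thresholds mirroring the "Scan Restriction Criteria" defaults quoted above.
MAX_FILE_COUNT = 9999          # decompressed file count
MAX_DECOMPRESSED_MB = 100      # total decompressed size
MAX_EXPANSION_RATIO = 1000     # decompressed/compressed size ("zip bomb" check)

def safe_to_scan(path: str) -> bool:
    """Return False if the archive exceeds any restriction and should be
    skipped or quarantined instead of scanned.
    Note: this does not recurse into nested zips, so the "layers of
    compression" criterion is not checked here."""
    with zipfile.ZipFile(path) as zf:
        infos = zf.infolist()
        if len(infos) > MAX_FILE_COUNT:
            return False
        total_uncompressed = sum(i.file_size for i in infos)
        total_compressed = sum(i.compress_size for i in infos) or 1
        if total_uncompressed > MAX_DECOMPRESSED_MB * 1024 * 1024:
            return False
        if total_uncompressed / total_compressed > MAX_EXPANSION_RATIO:
            return False
    return True
```

The ratio check is the interesting one: a classic zip bomb is tiny on disk but expands to many gigabytes, which is exactly the DoS scenario described above.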

Author Comment

ID: 40404096
This is quite a 'tricky' requirement: as the users publish their files
and need to view the scan results on-the-fly, I would say 20-30 secs
of scanning is the max they can wait before they get unhappy.

So be it zip or non-zip files, I'm trying to work out what is this 'magic'
figure of the file sizing, above which the scan will take more than 20-30 seconds.


Assisted Solution

btan earned 400 total points
ID: 40404394
Indeed tricky, but I do not think there is a magic number unless you try it out with a baseline using the same spec as the user machines. You have to do it for manual scan, since the user triggers the scan. For a start, try the recommended settings:

- if the time meets the 20-30 sec target, then I suggest you go 10% lower on size as a buffer, since background processes can eat resources; the impact is minimal though.
- if the time is beyond 20-30 sec, go 50% lower and do another baselining; likewise, once it is within the acceptable limit, go another 10% lower on size.

Unless support can suggest some performance figures, which normally they will not, as they will say it is environment specific with many dependencies. So you likely have to do it yourself.
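The baselining loop above is easy to script: time a scan over sample files ordered smallest to largest and keep the biggest one that stays within budget. A sketch, assuming you have some on-demand scanner CLI to invoke per file; `scan_cmd` is a placeholder, since Deep Security does not document a single-file scan command:

```python
import subprocess
import time

def time_scan(scan_cmd, path):
    """Run the (placeholder) scanner command on one file and return
    the wall-clock seconds it took."""
    start = time.monotonic()
    subprocess.run(scan_cmd + [path], check=True)
    return time.monotonic() - start

def largest_within_budget(scan_cmd, sample_paths, budget_secs=30.0):
    """sample_paths must be ordered smallest to largest; returns the last
    path whose scan fits the 20-30 sec budget, or None if even the
    smallest sample is too slow."""
    ok = None
    for p in sample_paths:
        if time_scan(scan_cmd, p) <= budget_secs:
            ok = p
        else:
            break  # samples are ordered, so larger ones will be slower
    return ok
```

Running this on hardware matching the user machines gives you the empirical "magic" size rather than a vendor guess.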

There is an old paper comparing DS performance with some performance specs, but the timing of scans is not stated (see "on demand scan"). It states some pre-populated file sizes; scanning 50 VMs took 14 hr 16 min.

Best practice guide for DS 9

Author Comment

ID: 40408217
Just thought of MS Excel (and a recent PowerPoint vulnerability) that may
have malicious macros. OK, the largest Excel/PPT file I've seen is 20 MB.

We may have users uploading Excel files to SharePoint servers/VMs, so I
guess 20 MB is a decent magic number.
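If that cut-off holds up in baselining, a trivial pre-filter on the upload path could enforce it; a hypothetical helper for illustration, not a Deep Security setting:

```python
import os

SCAN_SIZE_LIMIT = 20 * 1024 * 1024  # the 20 MB "magic number" above

def should_scan(path: str) -> bool:
    """Scan files up to the cut-off; larger files get flagged for
    out-of-band handling instead of holding up the user."""
    return os.path.getsize(path) <= SCAN_SIZE_LIMIT
```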

Expert Comment

ID: 40408228
Yup, test it as a form of profiling; I don't see this magic number posing any issue to TM DS unless support advises otherwise.
