Solaris folders to exclude from AV scan

I'm using a rather primitive AV scanner for Solaris whose documentation doesn't say
much about what to exclude from its scan so as not to trigger
systems-operations issues.

I've heard that scanning /net can get into an endless loop.

What about /tmp & tmpfs? These filesystems are world-writable,
so they can potentially harbour malware files, but /tmp is also
constantly written to by various apps. Will scanning /tmp cause
locking issues?

Presumably /proc & /sys should be excluded as well?

I guess database folders/filesystems should be excluded too.

What about the folders used by clustering solutions (e.g. Glassfish,
Oracle DB clustering & other application clustering)?

So far I've only found ClamAV & McAfee's Linuxscan suggesting
exclusions, as follows:

http://t63127.security-virus-clamav-user.securitytalk.info/clamscan-bug-feature-in-solaris-t63127.html
------------------------------------------------------------------------------------------------------------------------
clamscan -r /var --exclude=sa?? --exclude=syslog* --exclude=sulog
--exclude=messages*
clamscan -r /export/home
clamscan -r /usr
clamscan -r /tmp --exclude=mysql.sock
clamscan -r /etc --exclude=.name_service_door --exclude=.syslog_door
clamscan -r /usr/local/apache --exclude=*log

Avoid scanning /proc, /cdrom, /mnt, /vol, /xfn as you will be wasting your time.
If you scan /home you may run into the same problem as with /net. Scan /dev,
/devices, /kernel, and /platform at your peril.

You will want to exclude door files, very likely sparse files, db tables and
indices, and other special files such as Unix sockets and device files.
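A sketch of how one might pre-filter along those lines, assuming a scanner that (like clamscan) accepts a file list via -f/--file-list; `find -type f` keeps only regular files, which automatically skips sockets, FIFOs, device nodes and Solaris door files (/export/home is just an example root):

```shell
#!/bin/sh
# Sketch: build a list of regular files only.  -type f excludes
# sockets, FIFOs, device nodes and (on Solaris) door files, all of
# which have a non-regular file type.  /export/home is an example.
find /export/home -type f -print > /tmp/scanlist.txt

# Then feed the list to the scanner, e.g. with ClamAV:
#   clamscan --no-summary -f /tmp/scanlist.txt
```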


McAfee Linuxscan:
--------------------------
• Oracle database files
• /opt/oracle/.*.dbf (if oracle is installed under /opt)
• /opt/oracle/.*.ctl (if oracle is installed under /opt)
• /opt/oracle/.*.log (if oracle is installed under /opt)
• Evolution data files
• Thunderbird data files
• Encrypted files
• /var/log for on‑access scan
• /quarantine and /proc for on‑demand scan
• JAR files for on‑access scan
sunhux asked:

sunhux (Author) commented:
My AV scanner doesn't have an "--exclude=" option, but I would
still like to scan /tmp & /var.

What's the impact of scanning /var/.../syslog*, /var/.../sulog*
& /tmp/mysql.sock? Any disruption to apps/services if they are scanned?
sunhux (Author) commented:
Also, what's the impact if  *.jar files are scanned?
btan (Exec Consultant) commented:
In fact, what we're really discussing is running AV on Unix at all. A possible list of AV products for Solaris is shared here; it's old, but I believe it still stands, since the focus has mostly been Windows: https://blogs.oracle.com/jimlaurent/entry/anti_virus_software_for_solaris
For Symantec, there are recommended resource settings for optimal usage and scanning:
https://support.symantec.com/en_US/article.HOWTO79625.html#id-SF050111584

Regardless, looking at it as a whole, file size is one of the most impactful factors in deciding what to exclude from scanning. The default is always to scan everything on disk, but we know that is operationally unsound. I understand ClamAV scanning is hash-based, though like other AVs it also has heuristic, on-demand, real-time (on-access) and recursive scan capabilities, and all of those can be resource-intensive. Hence it is tough to gauge a generic baseline; you might even just go with the default install settings and monitor. Where possible, we want to avoid scanning large static files such as images, logs, DB files and SDK runtimes, including ClamAV's own files. This isn't so much a security practice as a performance fit for your design, which you need to assess and tune over time (including peak and non-peak windows, e.g. scanning during lunch time or a declared downtime).

I foresee that Solaris (which is not like Windows) will need some trial and error rather than a golden "one size fits all" scan template.

I am no Solaris expert, but there are past ClamAV discussions you may want to (or may already) have looked into. I tend to see it more as a selective-scan strategy instead of trying to come up with an exception list (whitelist vs blacklist, so to speak: the latter will grow and become hard to maintain, while the former covers the files that are really actively used, and hence of more concern to control, with real data continuously moving in and out of the systems).
Don't be so worried about using clamscan instead of clamdscan for large
scans. The difference in performance is insignificant. clamdscan only
has a big advantage over clamscan when scanning a single file or only a
few files at a time.
http://clamav-users.clamav.narkive.com/zDhYHs0s/exclude-with-clamdscan
This is a user problem, not a software problem. The solution
is to scan selectively as in (examples only - this message requires you to think):

clamscan -r /opt
clamscan -r /var --exclude=sa?? --exclude=syslog* --exclude=sulog
--exclude=messages*
clamscan -r /export/home
clamscan -r /usr
clamscan -r /tmp --exclude=mysql.sock
clamscan -r /etc --exclude=.name_service_door --exclude=.syslog_door
clamscan -r /usr/local/apache --exclude=*log

Avoid scanning /proc, /cdrom, /mnt, /vol, /xfn as you will be wasting your time.
If you scan /home you may run into the same problem as with /net. Scan /dev,
/devices, /kernel, and /platform at your peril.

You will want to exclude door files, very likely sparse files, db tables and
indices, and other special files such as Unix sockets and device files.
http://mailing.unix.clam-users.narkive.com/qtgPOPDG/clamav-users-clamscan-bug-feature-in-solaris

It may be good to review clamav's options as well, specific to file types including archives:
    --scan-pe[=yes(*)/no]                Scan PE files
    --scan-elf[=yes(*)/no]               Scan ELF files
    --scan-ole2[=yes(*)/no]              Scan OLE2 containers
    --scan-pdf[=yes(*)/no]               Scan PDF files
    --scan-html[=yes(*)/no]              Scan HTML files
    --scan-archive[=yes(*)/no]           Scan archive files (supported by libclamav)
    --detect-broken[=yes/no(*)]          Try to detect broken executable files
    --block-encrypted[=yes/no(*)]        Block encrypted archives
    --max-scansize=#n                    The maximum amount of data to scan for each container file (**)
    --max-files=#n                       The maximum number of files to scan for each container file (**)
    --max-recursion=#n                   Maximum archive recursion level for container file (**)
(*) Default scan settings
(**) Certain files (e.g. documents, archives, etc.) may in turn contain other files inside.
http://www.experts-exchange.com/Software/Anti-Virus/Q_27723670.html#a37988324
sunhux (Author) commented:
I was given a commercial Solaris AV scanner (which I must use, as it's our
dictated corporate standard), but it lacks the option to "--exclude=..." folders/files.

I'll need to scan folder by folder, but there are certain files in /tmp we don't
want to scan.

I guess I'll have to do something like:
"./scan  `find /var/* -print | grep -v sulog | grep -v syslog` "
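A hedged variant of that idea: using find's own name tests instead of grep -v avoids passing directory names to the scanner and copes with odd filenames (`./scan` here stands in for the unnamed corporate scanner, so this is only a sketch):

```shell
#!/bin/sh
# Sketch: scan /var but skip sulog*/syslog* files, using find's name
# tests rather than grep -v.  -type f also keeps directories, sockets
# and device nodes out of the list.  ./scan is the hypothetical
# corporate scanner binary from the comment above.
find /var -type f ! -name 'sulog*' ! -name 'syslog*' -print |
while read -r f; do
    ./scan "$f"
done
```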
sunhux (Author) commented:
There's also the issue that, if the scan overruns, I'll have to terminate it automatically.
btan (Exec Consultant) commented:
I suppose you're not saying ClamAV is the commercial version you have for your enterprise; ClamAV is still free and open source. Since we're back to the enterprise AV, why spend effort working around an alternative AV? The enterprise AV vendor's support should help; otherwise go for a tech refresh. AV has limits, especially on non-Windows OSes.

Regardless, that ./scan example will scan everything under /var except paths matching "sulog" and "syslog". The log overrun is to be expected, as those log files can be huge since they are continuously appended to. Overall, I do not see scripting solving this even if you drill further: eventually a limit will be reached (easily, in this case, as ClamAV is "greedy" too) and hardware RAM will need to increase, which is why I suggested a tech refresh (or switching to an AV that will not crash on overrun).
https://www.howtoforge.com/community/threads/clamav-eating-memory-and-stopping-mail-traffic.59074/

...I just find that we may be "barking up the wrong tree" (pardon me...)
sunhux (Author) commented:
No, it's not ClamAV I've been given but something else that's commercial
but rather primitive in its features.

>I foresee that Solaris (which is not like Windows) will need some trial
>and error rather than a golden "one size fits all" scan template.
Agreed. When users upload huge zips (they can reach 3 GB with tens of
thousands of files inside), the scan will simply run for days.

I suspect certain files in /dev & /boot could cause the AV scan to
go into a loop, so as a contingency I'm running the Solaris AV scan with
"timeout -k 1m 28800s nice /path/scan_command ..." so that if it
runs for more than 8 hrs it will abort itself. I'll then check the logs to
see which file is the culprit, skip it & re-run. We never know what files
users may upload that could send the AV scan into a loop.
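That timeout wrapper can be extended so the overrun case is detected explicitly: GNU coreutils timeout exits with status 124 when the limit expires (137 if the -k SIGKILL was needed). A sketch, with /path/scan_command standing in for the unnamed corporate scanner:

```shell
#!/bin/sh
# Sketch: abort the scan after 8 hours, then tell an overrun apart
# from a normal finish.  GNU timeout exits 124 on expiry, 137 if the
# -k SIGKILL was needed.  /path/scan_command is hypothetical.
timeout -k 1m 28800s nice /path/scan_command /tmp
rc=$?
if [ "$rc" -eq 124 ] || [ "$rc" -eq 137 ]; then
    echo "scan overran the 8h limit -- check the log for the culprit file" >&2
fi
```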
btan (Exec Consultant) commented:
Thanks for clarifying. Zips can be "killers", as AV can go recursive into them to considerable unzip depth. It's good to limit the file size; from the scan results you can then baseline which files are the big ones. For example, ClamAV has a file-size limit, so files over a certain size are excluded by the engine; it has options such as --max-filesize=100M (in this case limiting scanned files to 100 MB). There are more specifics in "man", plus another option on scan size.
The man page states:

--max-filesize=#n
Extract and scan at most #n kilobytes from each archive. You may pass the value in megabytes in format xM or xm, where x is a number. This option protects your system against DoS attacks (default: 25 MB, max: <4 GB)
--max-scansize=#n
Extract and scan at most #n kilobytes from each scanned file. You may pass the value in megabytes in format xM or xm, where x is a number. This option protects your system against DoS attacks (default: 100 MB, max: <4 GB)
For info: to have a log, specify the -l option, i.e. -l clamav.log

I suppose every AV has limits in scanning big files, ClamAV included. Its limits include:
- a limit on the size of the file: if the file is too large, ClamAV may exclude it.
- a limit on the amount of data to scan per single file: if the file is too large, ClamAV reads it but does not scan all of it.
From a past forum post:
Currently, ClamAV has a hard file limit of around 2.17GB. Because we're mapping the file into memory, if you don't have enough memory available to map the whole file, the memory mapping code (as currently implemented) will fail and the file won't be scanned.
I also don't think malicious code usually comes in such large sizes, though of course we can't neglect code injected inside a big file; it's a question of how secure we can get. We just want to make sure the AV does not impact running services.

Just for info: I believe that if those options are set in clamd.conf, the configuration file for the clamd service, the changes will not affect the clamscan tool.
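For reference, the daemon-side equivalents of ClamAV's size limits live in clamd.conf; these directives affect clamd/clamdscan only, while clamscan takes the limits on its command line (values below are illustrative, not recommendations):

```
# clamd.conf fragment (affects the clamd daemon, not clamscan)
MaxFileSize 100M
MaxScanSize 100M
MaxRecursion 16
```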

btan (Exec Consultant) commented:
One probably useful point, which may affect file-scanning overruns for ClamAV:
The key seems to be to set --bytecode-timeout= high so the scanner has time to scan the whole file. The default value is 60000 milliseconds/60 seconds, and I have set it to 190000, which works and doesn't give the timeout errors. This value could probably be set lower, but it works for me. Tested on two systems that had the errors before the setting.

UPDATE:

Tested on three systems and many scans, the errors are gone with this setting for --bytecode-timeout.

Here is the new command:

clamscan -r -i --remove --max-filesize=4000M --max-scansize=4000M --bytecode-timeout=190000 /DATA1
Note:

I also upgraded the server's memory to 8 GB. I'm not sure if clamscan loads the file into memory while it's being scanned, but one post said as much, and if so that is another consideration.
sunhux (Author) commented:
Just found that I can't scan Solaris socket & FIFO files, else the scan goes into a loop or gets stuck.
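One way to verify and work around that, sketched with standard find type tests (/tmp is just an example tree; the output feeds whatever scanner is in use):

```shell
#!/bin/sh
# Sketch: list the files that hang the scanner -- sockets (-type s)
# and FIFOs (-type p) -- then emit only regular files as scan
# candidates.  /tmp is an example directory.
echo "files to skip:"
find /tmp \( -type s -o -type p \) -print
echo "files safe to pass to the scanner:"
find /tmp -type f -print
```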
btan (Exec Consultant) commented:
Typically the file-open handle is held for a long time, especially with threads holding onto those files, and I doubt ClamAV can effectively run signature scans against a memory-mapped file. As a practice, self-triggered scanning should also not run during peak working hours; treat it as a maintenance-period activity, unless it's real-time, on-access scanning that is transparent, which the enterprise version should handle.