Automating Linux Anti-Virus Using ClamAV and Cron
Thankfully, Linux isn’t a platform with a significant virus problem, but it is always better to be safe than sorry. Luckily ClamAV is an excellent free anti-virus solution for Linux servers. However, at least on Red Hat Enterprise Linux 5 (RHEL5), the default install doesn’t offer any automated scanning and alerting. So here is what I’ve done:
The following steps assume you are using RHEL5, but should apply to other Linux distributions as well.
First, you’ll want to install ClamAV and start the clamd daemon (on RHEL5 the packages come from a third-party repository such as EPEL, so exact package names may vary):
yum install clamav clamd
/etc/init.d/clamd start
On RHEL5 at least this automatically sets up a daily cron job that uses freshclam to update the virus definitions, so that’s good.
Next, I recommend removing the test virus files that ship with the package, although you can save this until after you have tested the rest of the setup.
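To verify the alerting end to end (before or after removing the packaged test files), you can create your own harmless test file using the standard EICAR string. The /tmp path here is just an example:

```shell
# Write the 68-byte EICAR test string to a file; every AV engine,
# including ClamAV, flags it as a "virus" by convention. It is harmless.
printf '%s' 'X5O!P%@AP[4\PZX54(P^)7CC)7}$EICAR-STANDARD-ANTIVIRUS-TEST-FILE!$H+H*' > /tmp/eicar.com

# Then scan it manually; it should log "Infected files: 1" and fire the alert email:
# clamscan --log=/var/log/clamav/scan.log /tmp/eicar.com
```

Remember to delete the file once you have confirmed the alert email arrives.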
Now we want to set up our automation. I have a daily cron job that scans the entire server, which can take several minutes, and then an hourly cron job that only scans files which were created or modified within the last hour. This should provide rapid notification of any infection without bogging your server down for five minutes every hour. The hourly scans run in a couple of seconds.
Each scanning script then checks the scan logs to see if there were any infected files found, and if so immediately sends you a notification e-mail (you could set this address to your mobile phone’s SMS account if you wanted).
The Daily Scan:
Create the script /etc/cron.daily/clamscan_daily, make it executable, and paste in:
#!/bin/bash

# email subject
SUBJECT="VIRUS DETECTED ON `hostname`!!!"
# Email To ?
EMAIL="[email protected]"
# Log location
LOG=/var/log/clamav/scan.log
check_scan () {
# Check the last set of results. If there are any "Infected" counts that aren’t zero, we have a problem.
if [ `tail -n 12 ${LOG} | grep Infected | grep -v 0 | wc -l` != 0 ]
then
EMAILMESSAGE=`mktemp /tmp/virus-alert.XXXXX`
echo "To: ${EMAIL}" >> ${EMAILMESSAGE}
echo "From: [email protected]" >> ${EMAILMESSAGE}
echo "Subject: ${SUBJECT}" >> ${EMAILMESSAGE}
echo "Importance: High" >> ${EMAILMESSAGE}
echo "X-Priority: 1" >> ${EMAILMESSAGE}
echo "`tail -n 50 ${LOG}`" >> ${EMAILMESSAGE}
sendmail -t < ${EMAILMESSAGE}
fi
}
clamscan -r / --exclude-dir=/sys/ --quiet --infected --log=${LOG}
check_scan
The Hourly Scan:
Create the script /etc/cron.hourly/clamscan_hourly, make it executable, and paste in:
#!/bin/bash

# email subject
SUBJECT="VIRUS DETECTED ON `hostname`!!!"
# Email To ?
EMAIL="[email protected]"
# Log location
LOG=/var/log/clamav/scan.log
check_scan () {
# Check the last set of results. If there are any "Infected" counts that aren’t zero, we have a problem.
if [ `tail -n 12 ${LOG} | grep Infected | grep -v 0 | wc -l` != 0 ]
then
EMAILMESSAGE=`mktemp /tmp/virus-alert.XXXXX`
echo "To: ${EMAIL}" >> ${EMAILMESSAGE}
echo "From: [email protected]" >> ${EMAILMESSAGE}
echo "Subject: ${SUBJECT}" >> ${EMAILMESSAGE}
echo "Importance: High" >> ${EMAILMESSAGE}
echo "X-Priority: 1" >> ${EMAILMESSAGE}
echo "`tail -n 50 ${LOG}`" >> ${EMAILMESSAGE}
sendmail -t < ${EMAILMESSAGE}
fi
}
find / -not -wholename '/sys/*' -and -not -wholename '/proc/*' -mmin -61 -type f -print0 | xargs -0 -r clamscan --exclude-dir=/proc/ --exclude-dir=/sys/ --quiet --infected --log=${LOG}
check_scan
find / -not -wholename '/sys/*' -and -not -wholename '/proc/*' -cmin -61 -type f -print0 | xargs -0 -r clamscan --exclude-dir=/proc/ --exclude-dir=/sys/ --quiet --infected --log=${LOG}
check_scan
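If you want to see the hourly time filter in action without running a real scan, the find portion can be exercised on a scratch directory. The paths below are throwaway examples, and `touch -d` assumes GNU coreutils:

```shell
# Create one fresh file and one file last modified two hours ago.
mkdir -p /tmp/scan-test
touch /tmp/scan-test/fresh.txt
touch -m -d '2 hours ago' /tmp/scan-test/stale.txt

# Only the file modified within the last 61 minutes is selected.
find /tmp/scan-test -mmin -61 -type f
```

Only fresh.txt should be listed, which is exactly why the hourly pass stays so fast: clamscan never even sees the untouched files.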
Protected System
You should now have a well protected system with low impact to system performance and rapid alerting. Anti-Virus is only one piece of protecting a server, but hopefully this makes it easy to implement for everyone.
Thanks for writing these scripts. Also, there’s an erroneous semicolon on line 19 of the hourly script.
Joshua,
great catch on the semi-colon (fixed it) thanks!!
Glad you like the scripts!
Devon
Nice scripts – thanks for posting them. One problem – filenames with spaces. Example: "Time_spent on the queue-all messages.png"
Naturally, you get the following complaint:
Time_spent: No such file or directory
on: No such file or directory
the: No such file or directory
queue-all: No such file or directory
messages.png: No such file or directory
:)
Although to be fair, I think that’s Clam being dumb, not you ;)
Very good point! Assuming you’re only having the issue with the hourly script, it’s not Clam’s fault, it’s mine/Linux’s :) However, I’ve just updated the hourly script in a way that should handle files with spaces no problem. Give it a test!
Thanks!
Devon
Hi Devon,
Thanks for these great scripts! Very, very useful.
Two comments that may be helpful: when using -type f I did not pick up viruses in zip files, but as soon as I removed -type f altogether, I did find them. Also, the hourly script above runs 'find' and 'clamscan' twice.
cheers
Good point on the -type f thing. I didn’t check that. The twice-running thing is by design, but I had a cut-and-paste failure when I updated that script. It should run once with -mmin and once with -cmin, to find both files created in the last 61 minutes and files modified in the last 61 minutes. Fixing the script now, good eye!
I implemented the scripts for daily and weekly scans, though I have a question regarding the log file setup.
When I first tried to use the LOG option from the command line, I found that I had to create the scan.log file manually.
My question is: how did you set up the log file permissions so that the cron script will write to it?
It should run as root, so it shouldn’t need any special permissions. Just make sure the scan.log file is owned by root, and it can be world readable without too much risk.
Devon
The find command can list files that match on separate criteria, i.e. or/union. This should list the same files as your two-pass approach in one pass, without duplicates:
find / \( -wholename '/proc' -o -wholename '/sys' \) -prune -o -type f \( -cmin -61 -o -mmin -61 \) -print0
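That prune/union structure can be sanity-checked on a throwaway tree before wiring it into the script (the /tmp paths here are examples, and -path is used as the portable spelling of -wholename):

```shell
# Files under the "pruned" directory are never visited; everything else
# created or modified within the last 61 minutes is printed exactly once.
mkdir -p /tmp/prune-test/sys /tmp/prune-test/data
touch /tmp/prune-test/sys/skip.txt /tmp/prune-test/data/keep.txt

find /tmp/prune-test \( -path /tmp/prune-test/sys \) -prune -o \
     -type f \( -cmin -61 -o -mmin -61 \) -print
```

Only keep.txt should appear, confirming the single pass covers both the -cmin and -mmin cases without duplicates.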
Hi There
When the hourly cron runs, I get the following e-mail:
/etc/cron.hourly/clamscan_hourly:
/etc/cron.hourly/clamscan_hourly: line 33: unexpected EOF while looking for matching `"
/etc/cron.hourly/clamscan_hourly: line 37: syntax error: unexpected end of file
Can you please help me resolve this?
Thanks
Something is wrong as the clamscan_hourly script only has 31 lines. If you’re getting errors on lines 33 and 37 there must have been an issue with the copy/paste. Try this: http://pastebin.com/GcZAhqcA
Can you please explain what the hourly script is doing in
find / -not -wholename '/sys/*' -and -not -wholename '/proc/*' -mmin -61 -type f -print0
I want to understand why this solution performs so well. Thank you!
Sure. Basically this is finding files which have been modified in the past 61 minutes, ignoring files under /sys/ and /proc/, which are often file representations of system data/ports/etc. and will show up as changed all the time but be unscannable or slow to scan.
By checking only files created or changed in the past hour the hourly scan can quickly check for potential new viruses, without doing a full hard drive scan, which happens nightly just in case.
ah ok thank u for the explanation!
I’m still working on getting clamd started for scanning files on-access. Being a Linux noob, I cannot get clamd to react to uploaded test files. clamdscan is working, but the daemon itself doesn’t react to infected files; I don’t know what to do… :(
What do you want it to do? Using my scripts it will just e-mail you if a scan finds something, leaving you to take manual action. You can configure clam to automatically delete or quarantine flagged files as well, but that’s something you should read the clamav docs for.
Would there be any way that the clamscan_daily script can be put on the same site where the clamscan_hourly script was? The site where you could paste directly from. I’m having trouble getting the daily script to work properly. I’m getting these errors:
/etc/cron.daily/clamscan_daily: line 9: syntax error: unexpected end of file
/etc/cron.daily/clamscan_daily: line 7: unexpected EOF while looking for matching `"
Sure thing:
http://pastebin.com/bFpYgnrb
Good stuff. Thanks for the script, it works perfectly.
Bear in mind that the "grep -v 0" will filter out more than you intend, such as 10 or 105 (deity forbid) infected files.
I found that "grep -v ': 0'" works better.
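The difference is easy to demonstrate on a sample log line (the counts here are made up for illustration):

```shell
# A count of 10 contains the character "0", so `grep -v 0` throws the
# whole line away and the alert never fires:
echo "Infected files: 10" | grep Infected | grep -v 0 | wc -l      # 0 lines survive

# Matching ": 0" only filters out the genuine all-clear result:
echo "Infected files: 10" | grep Infected | grep -v ': 0' | wc -l  # 1 line survives
```

With the second pattern, "Infected files: 0" is still excluded while any non-zero count triggers the email.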
I just have a simple question,
Using the Daily script, how can I have it auto attach the Log file of that Daily Scan to it when sending the email.
So that I can then see what and where the virus was, without having to log in via Putty every time.
Thanks a mil,
Michael,
that’s a good idea. I don’t have time to update the script right now but if anyone does and wants to share it here, that would be great!
Devon
Hi,
I installed the cron.daily script and it works fine, however, all infected filenames (and full path) end up in the mail headers instead of the mail body. Any idea how to fix this ?
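A likely cause, for anyone hitting this: the alert script writes the log excerpt immediately after the X-Priority header with no blank line in between, and mail readers treat everything before the first blank line as headers. A minimal sketch of the fix (the stand-in values exist only to make the snippet self-contained; in the real script LOG and EMAILMESSAGE are already set):

```shell
# Stand-in values for demonstration; the real script defines these already.
LOG=/tmp/demo-scan.log
echo "Infected files: 1" > ${LOG}
EMAILMESSAGE=$(mktemp /tmp/virus-alert.XXXXX)

echo "Subject: demo" >> ${EMAILMESSAGE}
echo "X-Priority: 1" >> ${EMAILMESSAGE}
echo "" >> ${EMAILMESSAGE}              # blank line: everything after this is the body
echo "`tail -n 50 ${LOG}`" >> ${EMAILMESSAGE}
```

With the empty echo added before the tail line, the infected filenames land in the message body instead of the headers.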
This is a pretty good howto that could greatly be enhanced with a daily run of fresh clam:
23 2 * * * /usr/bin/freshclam
Would do alright. This requires the clamav-update package.
Thanks for this excellent resource – very well explained and easy to implement. Maybe a side-note about the clamavconnector in cPanel wouldn’t go amiss.
All the best,
Dee
I’ve never used cPanel, but I’ll take your word for it!
Sure, it’s just an automated Clam install through cPanel… I made the mistake of installing Clam manually and then installing over it with the cPanel plugin, thereby destroying all my config. Guess you live and learn :-)
Hi, Thank you for the scripts. I am having problems with hackers and need to check files to find where the problem is. I am getting an error after setting it up.
/bin/sh: /etc/cron.hourly/clamscan_hourly: /bin/bash^M: bad interpreter: No such file or directory
Can anyone see from this what I have done wrong?
Thanks
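For anyone else hitting this: the ^M in the error means the script file was saved with DOS (CRLF) line endings, so the kernel looks for an interpreter literally named "/bin/bash" followed by a carriage return. Stripping the carriage returns fixes it; dos2unix does this, or sed if it isn’t installed (demonstrated here on a throwaway file):

```shell
# Simulate a script saved by a Windows editor (CRLF line endings).
printf '#!/bin/bash\r\necho hi\r\n' > /tmp/crlf-demo.sh

# Remove the trailing carriage return from every line (GNU sed syntax).
sed -i 's/\r$//' /tmp/crlf-demo.sh
# Alternatively: dos2unix /tmp/crlf-demo.sh
```

After the cleanup the shebang line reads as plain "#!/bin/bash" and the cron job runs normally.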
THANK YOU for posting these scripts Devon ;)
Jo, you can just replace the shebang part like this:
#!/bin/bash
for:
#!/usr/bin/env bash
The bash interpreter, like many others, is not always located at /bin/bash; on many distros it lives at /usr/bin/bash. Using the env utility is a simple way to improve script portability, as it resolves wherever the interpreter is located on the current system.
Cheers!
Frans
Thanks for making these scripts.
A few of the changes I made were:
1. Changed -wholename to -path on the hourly. -path runs everywhere while -wholename does not.
2. Dropped –exclude-dir on the hourly since clamscan receives a filtered stream from find already.
4. Dropped the clamscan -r parameter on the hourly because it serves no purpose with a filtered find stream.
5. Did a cat on the two finds and piped it through sort -zu. (hourly) This avoids scanning the same files twice and produces one entry in the log instead of two.
6. Made a second version of the hourly where I can specify with wildcards where to scan, rather than where not to scan. This gives fine control of what gets scanned, which is also very useful for web servers.
7. Parameterized everything so no code needs to be edited outside of header variables.
*If you see any issues with this logic, let me know.
Thanks!
IT Architect,
that sounds great! Can you share your improved scripts? You can email me: [email protected] if you like. Comments might mess up the formatting.
> Comments might mess up the formatting. Can you share your improved scripts? <
I can, but about the only code here that resembles mine is the email alert form. LOL!
Other: I hope you're a Linux guy, because I'm a UNIX guy. I got out of Linux in 2008, due only to the wild server loads we have. One of my goals is that the scripts run on any *NIX, and about any version, and I don't have a Linux box to try them on. Your timing is also great. Teething issues are over, and I just finished a code cleanup. The code and logic are well commented.
Hey that sounds great! Feel free to email it over to me, or if you post it on your own blog I’ll just link to it or whatever is easiest:)
Yes I’m a Linux guy so happy to test stuff on RHEL 5 for ya! Thanks!
Devon,
Nice work, your site has been VERY informative. I am currently running ClamAV on UNIX and would love to get a copy of IT_Architects version of the script to test in my environment. Do you think you could email me a copy of it?
I would greatly appreciate it, thanks!
http://forum.directadmin.com/showthread.php?t=45896
Dear Devon,
thanks for your excellent script.
I’m kind of new at this.
Do I need to set up crontab -e, or will creating the daily and hourly scans be fine?
My question is: when will these scripts start?
Sorry for the dumb question.
Thanks,
Sam
Sam,
if you create the scripts in the locations mentioned: /etc/cron.hourly/ and /etc/cron.daily then they will run automatically (assuming you’re running on a system that has those locations – I use RHEL 5). You won’t need to add them to a crontab directly.
My current crontab -e results are:
30 5 * * * /usr/local/cpanel/scripts/optimize_eximstats > /dev/null 2>&1
58 4 * * * /usr/local/cpanel/scripts/upcp --cron
0 1 * * * /usr/local/cpanel/scripts/cpbackup
35 * * * * /usr/bin/test -x /usr/local/cpanel/bin/tail-check && /usr/local/cpanel/bin/tail-check
45 */4 * * * /usr/bin/test -x /usr/local/cpanel/scripts/update_mailman_cache && /usr/local/cpanel/scripts/update_mailman_cache
30 */4 * * * /usr/bin/test -x /usr/local/cpanel/scripts/update_db_cache && /usr/local/cpanel/scripts/update_db_cache
45 */8 * * * /usr/bin/test -x /usr/local/cpanel/bin/optimizefs && /usr/local/cpanel/bin/optimizefs
30 */2 * * * /usr/local/cpanel/bin/mysqluserstore >/dev/null 2>&1
15 */2 * * * /usr/local/cpanel/bin/dbindex >/dev/null 2>&1
15 */6 * * * /usr/local/cpanel/scripts/autorepair recoverymgmt >/dev/null 2>&1
*/5 * * * * /usr/local/cpanel/bin/dcpumon >/dev/null 2>&1
34 22 * * * /usr/local/cpanel/whostmgr/docroot/cgi/cpaddons_report.pl --notify
7,22,37,52 * * * * /usr/local/cpanel/whostmgr/bin/dnsqueue > /dev/null 2>&1
2,58 * * * * /usr/local/bandmin/bandmin
0 0 * * * /usr/local/bandmin/ipaddrmap
0 1 * * Wed /usr/local/1h/bin/1h_updates.sh >> /usr/local/1h/var/log/1h_updates.log 2>&1
* * * * * /usr/local/1h/sbin/data2rrd.pl >> /usr/local/1h/var/log/data2rrd.log 2>&1
*/30 * * * * /usr/local/1h/sbin/rrd2img.pl local >> /usr/local/1h/var/log/rrd2img.log 2>&1
*/5 * * * * /usr/local/1h/sbin/apply_changes.pl >> /usr/local/1h/var/log/local-interface.log 2>&1
7 0 * * * /usr/local/1h/bin/update_expire_times.sh >> /usr/local/1h/var/log/update_expire_times.log 2>&1
59 * * * * /usr/local/1h/bin/mysql_stats_dumper.pl > /dev/null 2>&1
0 * * * * /usr/local/1h/sbin/take_limits_down.pl >> /usr/local/1h/var/log/take_limits_down.log 2>&1
45 23 * * 6 /usr/local/bin/clamav-cron /home
Hey All,
Love your scripts, they have really added so much benefit to my servers. I do have a question if you don’t mind. I noticed the clamscan_daily gives me the following error when it runs:
/etc/cron.daily/clamscan_daily: line 27: 30625 Terminated clamscan -r / --exclude-dir=/sys/ --quiet --infected --log=${LOG}
Any idea why this happens and how to fix it?
Thanks in advance
Simple curiosity: is there a reason why, in the hourly script, the last 'find' command is repeated twice with the same parameters?
It isn’t. You can’t see it well on this site, but if you copy/paste you’ll see that one line uses -cmin and one uses -mmin, looking for files that were created or modified in the last 61 minutes. It’s possible that newly created files have a modified timestamp of their creation, so the second pass may not be needed… I haven’t checked that.
I’ve found the “default” clamav rules to be ineffective against the usual trojans/malware uploaded on hacked server accounts.
I recommend using special rules such as the atomic security clamav signatures.
Hello.
Thanks for these scripts. They are very helpful.
I have only one question.
How do I get the date and time of the scan into the reports?
Thanks in advance
Wesley
You just want to get the date and time inserted into the email? Just add an echo `date` >> in there.
Where?? there is no link :-))
In the set of echoes that generate the email. Although really, the email should have a date/time sent anyhow, right?
Right.
One more question.
The scan report does not show the location of the infected file.
How do I add this to the scan report?
I wonder that no one has actually fixed this script since 2010… with 52 comments…
First, you have issues in this line: if [ `tail -n 12 ${LOG} | grep Infected | grep -v 0 | wc -l` != 0 ]
You don’t have to use "tail", because you already have "grep Infected". Then "grep -v 0" will only keep lines without a zero, so if there are 250 or 105 infected files, for example, it will never be reported; those counts contain a 0 and are simply excluded from the grep results. You also have to reset the log every day, and copy the infected log aside… And don’t rely on mail, or even sendmail, too much, because of DNS issues or other problems…
@Viktor,
I’m not sure I agree with you here. You have to use tail, since the log file will have the results of MANY runs of the scan; if you don’t use tail, you will get false positives from old scans which you may have already addressed (at least until your log rotates). Tail solves that cleanly with no downside.
You are correct about the grep -v 0 however. Not sure the easiest way to fix that. It’s too early for regexes:)
Not sure what you mean by not using mail because of DNS issues? If you don’t want an email notification, I guess you can feel free to do whatever you want, but I like getting an email notice :)
You are running clamscan DAILY, so reset the log after the scan and copy the infected log aside. You have the --infected option, so only infected files will be logged. That way you will have a new log every day.
if [ $(grep "Infected.*[1-9].*" ${LOG} | wc -l) != 0 ]
then
…….
…….
cp ${LOG} ${LOG}_CHECK_ME
echo "" > ${LOG}
else
echo "" > ${LOG}
fi
Also running it hourly, same log file. I think having a tail -n 12 is a LOT simpler than trying to bake in log rotation (when you should be using logrotate.d anyhow). Your pseudo code is already 10x more complex than the tail -n 12.
cp ${LOG} ${LOG}_CHECK_ME_${DATE}
Hi Devon,
Thanks for the scripts! I’m currently trying to implement the clamscan_daily script. Unfortunately it’s not sending me the email messages. I’m currently using postfix, so I’m not sure what I have to change to make the script work. Any help resolving this issue would be greatly appreciated. And one more thing: how would I enable freshclam updates in the script?
Thanks,
Reid
Thanks for this, it worked great! The only adjustment I needed to make was to add the --max-filesize parameter to suppress this warning: "LibClamAV Warning: cli_scanxz: decompress file size exceeds limits - only scanning 27262976 bytes"
oops, i meant to say max-scansize
Great tutorial…
I’m trying to set up my server… (VPS CentOS 6.6 with Plesk ODIN 12.0.6)
I set up the cron etc… but I get this error message:
/etc/cron.hourly/clamscan_hourly:
/etc/cron.hourly/clamscan_hourly: line 5: Log: command not found
find: File system loop detected; `/var/named/chroot/var/named’ is part of the same file system loop as `/var/named’.
find: File system loop detected; `/var/named/chroot/var/named’ is part of the same file system loop as `/var/named’.
What is the problem? Can you help me?
Thank you for all
Great stuff. I use these scripts on my Ubuntu servers. Someone has modified your scripts to show infected/removed files. What do you think about those changes?
https://techknight.eu/2015/01/22/setup-clamav-virus-scan-updates/
Great script!
I use this on my desktop PC, but I use ssmtp, not sendmail. Ssmtp does not support the -t option, so I deleted the 5 “echo” lines 16 thru 20 and change line 22 to:
mail -s “${SUBJECT}” “${EMAIL}” < ${EMAILMESSAGE}
All working fine – thanks.
Chris