Kurt Seifried, [email protected]
There's been talk about UNIX viruses, and more specifically Linux viruses. Recently a trojan was released for Linux called the "Remote shell trojan", and widespread reports likened it to Code Red. Nothing could be further from the truth. This "Remote shell trojan" is actually a virus, and a poorly done one at that. Many of the articles managed to get hold of virus experts, which is good, but these experts generally have little knowledge of UNIX. That isn't surprising, as most viruses and most anti-virus software are written for Windows (only in the last year or two have many anti-virus vendors ported their products to UNIX platforms). First, let's define the problem and some of the more important terms, since some of the articles I have seen mix things up in strange ways. A virus is a piece of software that can infect data and applications, often replicating itself; it may or may not be harmful (intentionally or otherwise). For example, some viruses append themselves to executable files, which can then spread when users share files. Other viruses (like the Melissa virus) simply rifle through the contents of your email address book and send themselves to all the people listed in it, severely overloading mail servers in the process.
Getting back to basics, however: viruses have to be run by users to do anything. In general home users are their own administrators, but with Windows 95/98 there is no distinction between the user and the administrator, and additionally there is absolutely no file security. This means that anyone running any software can do anything to a Windows 95/98 box, from installing extra software to formatting the hard drive. Additionally, the vast majority of Windows home users do not have up-to-date anti-virus software (if they have any at all), as compared to corporate environments, which usually have anti-virus software with automated update methods. Home users running various UNIX platforms typically have a root account (like Administrator in NT) and a user account, and most UNIX versions make it very clear that root should not be used for normal access. When you are logged in as a user on most UNIX platforms, you can do very little (if anything) to harm the system. Typically the only areas a user can actually write data to are the /tmp directory and their home directory. Directories with configuration files (such as /etc) or binaries (/bin, /sbin) are protected from users on every modern UNIX I have seen. So even if a user does run an infected file, chances are it cannot do too much damage. There are of course exceptions to the rule: several major FTP sites have been broken into and various popular software packages modified, but these incidents are usually caught quite quickly when someone checks the file checksums or signatures (which, oddly enough, most Windows sites do not seem to bother with).
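Checking a checksum is quick. The sketch below creates its own stand-in file, since the package name and its published .md5 file are hypothetical; a real vendor would publish the checksum (or a GPG signature) alongside the download.

```shell
# Sketch: verifying a downloaded file against a published checksum.
# The package name is hypothetical; in real life the vendor publishes
# the .md5 file (or a GPG signature) and you download both.
echo "pretend this is a software package" > package-1.0.tar.gz
md5sum package-1.0.tar.gz > package-1.0.tar.gz.md5   # the "published" checksum

# After downloading, verify the file against the published sum;
# md5sum -c exits non-zero if the file has been tampered with.
md5sum -c package-1.0.tar.gz.md5
```

For signed packages, `gpg --verify` against the vendor's detached signature serves the same purpose with stronger guarantees, since a checksum hosted next to a tampered file could be tampered with too.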
As far as corporate environments go, they tend to be heavily homogeneous, with virtually all of the desktop machines running the same version of Windows, hopefully NT. NT does support file permissions, and typically users do not have administrative rights to their local machines (giving users such rights tends to increase support costs, because users are prone to errors). Unfortunately, the default file permissions for NT (both Server and Workstation) give Everyone (literally, a group called Everyone) Full Control over the entire system, meaning anyone can change permissions as well as modify files. Tightening up these permissions is problematic, since many popular software packages require write access to system directories. Windows 95/98 makes a rather poor desktop because the user can easily manipulate it (it has no concept of file permissions or user roles); you can protect the surface with user policies (such as removing Network Neighborhood from the desktop), but you cannot protect the underlying system effectively. Most corporations invest in anti-virus software and do keep it relatively up to date, but when email viruses can spread around the globe in a matter of hours (as opposed to days, weeks, and months in the days before the Internet and mass access to email), getting software updates in time is challenging (almost impossible, one might say). On the other hand, UNIX platforms in corporate environments tend to be servers, which are reasonably well protected (and where I can guarantee user access is limited), or desktops for engineers, scientists, and other "serious" types. One disadvantage is that the environment in a corporation tends to be more uniform, so if a virus does manage to slip in, chances are it will run rampant quite quickly.
| Feature / OS | Windows 95 / 98 | NT 4.0 / Windows 2000 | Linux / UNIX in general |
|---|---|---|---|
| Filesystem security | No | Yes - ACLs, but many programs such as Microsoft Office, Corel, etc. require read/write access to sensitive directories; disabling this access can break some programs. | Yes - User / Group / Other; most important directories and files are protected by default against users. |
| Memory protection | Not really | Yes, has problems | Yes, some problems |
| Chroot for network services | No | Minor, usually unused | Yes, some problems (a process that is chrooted but still running as root can probably break out) |
| Run processes as an unprivileged user | No | Yes, has problems | Yes |
| Software installation | Typically done by users | Should be done by administrators only, but much is done by users | Almost always done by administrators; users can typically install software into their own directories |
| Anti-virus scanning software | Yes | Yes | Yes |
| File integrity tools | Yes | Yes | Yes (including many free ones) |
| Overall resistance to viruses | No resistance; decent with anti-virus software | Some resistance; decent with anti-virus software | Good resistance; anti-virus software adds little (though it is good for protecting client machines, such as email users). |
There is some good news (though not a whole lot). If a user runs a program under a normal user account, chances are it cannot write to system binaries. This significantly decreases the effectiveness of viruses delivered via email or other data sources, since they can only modify the user's own files and not infect the system. The reason viruses are so effective on Windows is that the default file permissions in NT give Everyone Full Control (and many sites do not tighten this), and of course Windows 9x has no file permissions at all. The flip side is that most Linux machines have at least one (and often many) local root exploits. Examples include Perl, mail, Sendmail, the Linux kernel itself, and much more. Unless an administrator keeps machines up to date, a sophisticated virus could exploit such a weakness to modify system files, or install trojans and backdoors. Unfortunately most machines are not kept up to date very well, and even if they are, there is a window of opportunity between a vulnerability being reported and a vendor upgrade being issued (although Linux has some of the lowest averages, in some cases under 24 hours). This makes writing an effective virus for Linux harder, but not much more difficult than writing one for Windows.
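The mode bits doing this work are easy to inspect. A minimal sketch (the scratch file is hypothetical; on a real system the same mechanism is what keeps users out of /etc, /bin, and /sbin):

```shell
# Sketch: standard UNIX User/Group/Other permissions. System
# directories are typically mode 755 and owned by root, so ordinary
# users (and any virus running as them) can read but not write.
ls -ld /etc

# The same mechanism on a scratch file: 644 means the owner can
# read/write, while group and other can only read.
echo "data" > userfile
chmod 644 userfile
stat -c '%a %n' userfile
```

Run as an unprivileged user, an infected program inherits exactly these restrictions, which is why it can trash a home directory but not replace /bin/login.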
The best defenses against viruses are as follows:
- Do not run software, especially untrusted binaries, as root.
- Keep system software up to date with vendor security fixes.
- Verify checksums and signatures on downloaded packages before installing them.
- Deploy anti-virus software in multiple layers: workstations, servers, and gateways.
So, back to the "Remote shell trojan". In order to be infected by it you must run a binary infected by it, as root. Most binary Linux software is signed by the vendor that produces it, and a quick check of the signature will reveal whether the package has been changed. In addition, this "Remote shell trojan" cannot replicate across networks: it cannot send itself out as an email attachment, or hunt for and infect network shares. The Code Red worm, by contrast, will infect any NT or 2000 machine running a default configuration without sufficient security updates (estimates run from 300,000 machines and up infected by Code Red). The number of UNIX and Linux viruses will of course increase, but I doubt we will see the explosion that the Windows world has been suffering in recent years. The argument that "the increasing popularity of Linux (and UNIX in general) will mean more viruses" is correct, but only in a limited way. The general usage habits and layout of the system will defeat the majority of viruses quite effectively.
A virus scanner doesn't do you any good if it's not somewhere along the path the virus takes: into your network, onto your machine, and finally into execution. When deploying antivirus software there are a number of factors to consider:
The most obvious place to install antivirus software is on users' workstations. The benefits of this are numerous:
- The workstation is where content is ultimately executed, so scanning there catches viruses no matter how they arrive.
- Content from removable media (floppy disks, CD-ROMs, zip disks) gets scanned, which no gateway product can do.
- Content downloaded over encrypted (https) connections is scanned after decryption, where gateway scanners cannot see it.
- Mobile users with laptops remain protected even when away from the corporate network.
There are also problems with placing the antivirus software on workstations:
- Users can disable the software, intentionally or accidentally.
- Keeping the software updated on every machine is difficult, and out-of-date signatures miss new viruses.
- The software can simply stop working, for example by failing to start at boot time, without anyone noticing.
Antivirus software should be loaded onto workstations when possible, especially for mobile users with laptops. Consideration should also be given to purchasing antivirus software for people who telecommute from home; even if you do not load antivirus software onto corporate desktops, it is extremely important to load it on users' home machines. The reason is that home machines will be accessing the Internet and are probably not protected by any form of antivirus software at the ISP's end. Additionally, since users tend to use the same passwords on home machines as on machines at work, and to log in to work machines from home, it is important to make sure these home machines are not compromised.
However the disadvantages are also significant:
The last method is to use a product like Trend Micro VirusWall, which scans network traffic for viruses and acts as a proxy for a number of services. The problem with these is that they are relatively easy for sophisticated attackers to bypass, so if someone targets a virus specifically at you they can probably get it through. In one case NAI's add-on antivirus product for their firewall actually opened the firewall up to a remote attack in which an attacker could gain root (i.e. full administrative) access to the machine running the software. Obviously, having an attacker in control of your virus-scanning firewall is a bad thing. If these products work as advertised, however, they allow you to scan all traffic coming into and leaving your network from a few gateway locations (most corporate networks have one or two access points). Of course, if someone manages to get a virus into the company (say, on an infected diskette), it can then spread like wildfire, since there is no internal protection. To summarize the benefits:
- All traffic entering and leaving the network can be scanned at one or two gateway locations.
- Only a few machines need to be maintained and updated, rather than every workstation.
And the disadvantages that are present:
- Encrypted traffic (SSL websites, encrypted email) cannot be scanned and passes through untouched.
- A sophisticated attacker who targets you specifically can probably slip a virus past the gateway.
- Once a virus gets inside the network (for example, on an infected diskette), the gateway provides no internal protection.
- The scanning software itself can contain flaws that expose the firewall to attack.
Probably the best solution is to use a combination of workstation and server antivirus software. Incoming email should definitely be scanned for viruses, especially if you have many clients using Windows mail programs (especially Outlook). Software needs to be placed on workstations, since users can (usually) browse the web and download content; if users visit secure (https) sites, any gateway scanning will be ineffective, since the content is encrypted. Users are also in the habit of using removable media such as floppy disks, CD-ROMs, zip disks, and so forth, which they may also be using on unprotected home machines. Antivirus software should be installed on file servers if possible, and scans of users' files should take place in real time if possible (although I know of none that currently do this, there is apparently work being done by Samba and Linux programmers to make it possible); barring that, a daily or twice-daily scan will help significantly. Scanning backups is also a good idea, as new viruses may make it into your network before your antivirus software is updated. As always, multiple layers of defense with no single point of failure are the best practice.
Software being disabled, intentionally or accidentally, is a problem. Sometimes users will simply start right-clicking on things in the taskbar to free up memory, or do a ps listing and kill off commands that don't look critical. Sometimes software just stops working, i.e. fails to start at boot time, or is not called properly by a helper program such as AMaViS, which scans email. If possible, you should remove the user's ability to disable the software. On a UNIX workstation, make sure the files and processes are owned by a special user (vscan, or perhaps root); this way the user cannot kill the process or otherwise disable it (unless, of course, the user has local root access). If you must give the user local root access, consider something like a "hidden" cron job to restart the process in case the user accidentally (or otherwise) disables it. You should also make it clear in your security policy that disabling security-related software (virus scanners, firewalls, etc.) is a strict no-no, and back this up with periodic testing of the software. You don't need to send a live virus (always a bad idea); instead you can send one of a number of test signatures, for example eicar.com (I would put the signatures in this web page, except that doing so would set off programs like Norton Anti-Virus). You can find eicar.com online at many sites; it's 68 bytes in size (i.e. tiny).
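Such a restart job can be as simple as a watchdog script run from cron. A minimal sketch, where "vscand" and the restart path are hypothetical placeholders for your actual scanner daemon:

```shell
# Sketch: restart the scanner daemon if a user has killed it.
# "vscand" and the restart command are placeholders.
restart_if_dead() {
    # pgrep -x matches the exact process name; no match = not running
    if ! pgrep -x "$1" > /dev/null 2>&1; then
        echo "$1 not running - restarting"
        # /usr/local/sbin/"$1" &   # real restart command goes here
    fi
}

restart_if_dead vscand

# Run it from root's crontab every five minutes, e.g.:
#   */5 * * * * /usr/local/sbin/scanner-watchdog
```

Since the job runs as root from root's crontab, a user without local root access cannot remove or edit it, which is the point of calling it "hidden".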
Software not being updated properly (or at all) is another major problem. If you can automate the process, you should do so; at the very least, make it so you can download the update once to an internal server and then have all the clients update from it quickly. The speed at which a modern virus propagates (especially via email) is nothing short of amazing. Once you update the software you may need to reboot the machine (this is primarily a Windows issue and is not needed on most UNIX systems). In most cases you will need to restart the scanner software so it rereads its configuration and data files; a simple kill -HUP will usually do the trick. Some virus scanners have a main data file and then a directory for updates (consisting of individual files for various new signatures, etc.); in some cases software that uses the antivirus scanner to scan email may not pick up on the updates directory, meaning any new viruses will slip past easily. In this extreme case you may want to verify the updates are in effect by sending a live virus through the virus scanner, but be careful: if your virus scanner doesn't pick it up and the virus manages to get loose, you are in deep trouble.
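The reload step can be sketched with a toy background job that traps SIGHUP the way a well-behaved daemon does; the real daemon name ("scannerd" below) is hypothetical.

```shell
# Sketch: a toy "daemon" that rereads its configuration on SIGHUP,
# the way most scanner daemons do.
rm -f hup.log
( trap 'echo "rereading configuration" >> hup.log' HUP
  for i in 1 2 3 4 5; do sleep 1; done ) &
PID=$!

sleep 1
kill -HUP "$PID"      # on a real system: kill -HUP $(pidof scannerd)
wait "$PID" 2>/dev/null

cat hup.log           # shows the trap fired instead of killing the process
```

The contrast with an ordinary process matters: without the trap, SIGHUP would simply terminate it, which is why some daemons use a separate reload command instead.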
Bypassing the software is another problem. If you install an antivirus firewall and scan all traffic to and from the Internet, it is still possible for users to slip a virus in (intentionally or not). The simplest case is an SSL-based website: the virus scanner can't scan encrypted content, so the virus will get past; encrypted email is another example of the same problem. If you have a dial-in pool of modems, you had better make sure there is a virus-scanning firewall between it and the network (a regular firewall in between is also a good idea, preferably one that requires authentication). If users are allowed to use removable media (floppy disks, CD-ROMs, zip disks, etc.), they can inadvertently introduce a virus onto the network. The best way to deal with this class of problems is to put antivirus scanning packages on workstations, firewalls, email servers, and so on. Preventing users from using removable media is another option, and can also prevent information leakage or theft (i.e. a user copies critical files to a floppy, takes it home, accidentally leaves it exposed, and someone copies the files). In UNIX you can easily address this by setting proper file permissions in the /dev/ directory; however, it is trivial for users to email files out, so this is not perfect (though you can at least keep copies of outgoing email for inspection).
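Tightening device permissions is a one-liner. The sketch below uses a scratch file in place of /dev/fd0, since changing real device nodes requires root:

```shell
# Sketch: restricting access to removable-media devices. A scratch
# file stands in for the floppy device node; on a real system, as root:
#   chgrp floppy /dev/fd0 && chmod 660 /dev/fd0
# so that only root and members of the "floppy" group can use the drive.
touch fake_fd0
chmod 660 fake_fd0        # read/write for owner and group, nothing for others
stat -c '%a %n' fake_fd0
```

Users you trust with the drive go into the floppy group; everyone else simply cannot open the device, with no scanner involvement required.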
Lastly, even with all of the above in place, it is possible for a virus to sneak in and stay resident. If possible you should conduct "sweeps" (i.e. scan all the files) on workstations and servers periodically; weekends at 2 am are a good time to do this. I have seen several universities with public kiosks, ancient 486s painted bright red running Linux and antivirus software: the user puts a floppy disk in the drive, hits Enter, and the disk is scanned. This type of service is ideal since it is anonymous and easily accessible; there is a certain amount of stigma attached to having a virus, and it can be very embarrassing ("how could you be so stupid as to get a virus on your computer?"). There is also work being done by companies like Norton to develop a web-based virus scan: you simply go to a web page, download a small program, and it scans your system for common (i.e. current) viruses. There was even talk of making this a requirement for customers of various online services, however the idea of having to submit to a mandatory scan by third-party software, over which you have no real control, is rather unappealing to most users.
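Periodic sweeps are easy to schedule with cron. In the sketch below the scanner binary, its flags, and the log path are all hypothetical placeholders:

```shell
# Sketch: a full-filesystem sweep every Sunday at 2 am. The scanner
# command and log path are placeholders for your actual product.
CRON_LINE='0 2 * * 0 /usr/local/bin/scanner --recursive / >> /var/log/vscan.log 2>&1'
echo "$CRON_LINE"

# To install it for the scanning user:
#   echo "$CRON_LINE" | crontab -
```

Logging to a file and reviewing it is important; a sweep that quietly finds (or quietly fails to run) does you no good.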
Like any security technology, antivirus software needs to be planned out, implemented carefully, and, most importantly, maintained properly. Failure to do so will render it useless, perhaps worse than useless, because you believe you are protected when in fact you are not. Antivirus software is also not a complete solution on its own; it is simply part of an overall security framework, complemented by firewalls, IDS systems, access controls, and so on.
Last updated 4/10/2001
Copyright Kurt Seifried 2001