UNIX Security Auditing: A Practical Guide

Jack Maynard

Pick up any trade magazine these days and chances are it has an article regarding computer security. Some are mostly hype, while others provide a realistic view of the problems facing the industry. Regardless of whose viewpoint you choose to believe, the conclusion is the same - the computing world is not as safe as it once was. The concept of open systems has changed dramatically over the past 20 years, from what was once an open exchange of information and access to something more restrained. Today, a strong information security program is a must. Having a security plan in place is a good start; however, you must also verify that you are maintaining an environment with a high level of protection for your computing resources. UNIX security is a complex task, composed of many parts, and mistakes that undermine a secure environment are easy to make. Over time, permissions, configurations, and new software installations can leave a computing environment insecure, and the consequences can be costly. War Room Research of Baltimore, Maryland reports that 67% of companies that reported a computer break-in paid more than $50,000 to recover. If that seems steep, consider that the same report also said that 27% of those companies paid more than $500,000 to recover. Clearly, there is a case for verifying your security infrastructure, and one method is the security audit.

When to Audit

Dan Farmer, a recognized UNIX security expert and author of programs such as COPS and SATAN, recommends the following timelines for conducting security audits:

  • before you go live

  • scheduled

  • emergency (after a break-in)

    Before you go live: One of the problems that contribute to break-ins is sites that do not implement or enforce procedures and standards when installing new hosts on their network. By auditing these systems before they go live, you can eliminate the security problems that exist in new systems. Never trust the standard, out-of-the-box security of your specific UNIX vendor. Hardening these hosts to conform to strict security standards and eliminating known vulnerabilities in binaries, access control, and user authentication will go a long way toward reducing your risk.

    Scheduled: Regularly scheduled audits, when conducted in a thorough manner, will verify that you are maintaining your security standards and greatly reduce the risk of a security incident. Site size, criticality, complexity, and staff should all be considered when deciding how often to conduct scheduled audits. Use the following general guidelines for conducting audits on a scheduled basis:

  • individual hosts: audit every 12-24 months

  • large network: audit every 24 months

  • small network: audit every 12 months

  • firewall: audit every 6 months (or less)

    Emergency: If you have experienced a security incident, you must determine the extent of the damage. An audit can help, but it is extremely difficult to do without integrity-checking software in place prior to the incident. The UNIX operating system consists of a large number of files and directories (everything in UNIX is a file). Integrity programs such as TAMU Tiger or Tripwire, when implemented before you experience a problem, can help identify changes in file and directory permissions, ownerships, file characteristics, and modifications to operating system binaries.
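
    The idea behind such tools can be illustrated with a short script. What follows is a minimal sketch, assuming Python is available, and is in no way a replacement for Tiger or Tripwire: it records the permissions, ownership, and MD5 digest of every regular file under a directory tree so that a later run can be compared against the saved baseline. The name baseline.py is purely illustrative.

    # baseline.py -- minimal sketch of file-integrity baselining; records
    # permissions, ownership, and an MD5 digest for each regular file.
    import hashlib, os, stat, sys

    def md5sum(path):
        """Return the hex MD5 digest of a file, read in 64-KB chunks."""
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def snapshot(root):
        """Yield one record per regular file: path, mode, uid, gid, digest."""
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    st = os.lstat(path)
                    if not stat.S_ISREG(st.st_mode):
                        continue              # skip symlinks, devices, etc.
                    yield "%s %o %d %d %s" % (path, st.st_mode, st.st_uid,
                                              st.st_gid, md5sum(path))
                except OSError:
                    continue                  # unreadable entry; skip it

    if __name__ == "__main__":
        # Usage: python baseline.py /usr/bin > baseline.txt
        # Rerun later and diff the two outputs to spot changes.
        for record in snapshot(sys.argv[1]):
            print(record)

    Running this on a freshly installed system, storing the output offline, and diffing it against later runs gives a rough picture of what has changed and when.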

    Develop Your Security Policy

    The word audit means "to examine with intent to verify," and a security audit is an attempt to verify that the day-to-day operation of your computing environment is in alignment with your information security policy. A security policy should be the baseline for your audit; it is the standard against which your configurations are measured. Until you have a comprehensive security policy, you can't measure whether you are meeting the goal of maintaining a secure environment. Information security policies vary from site to site, and some sites have no written policy at all. If you already have a policy in place, an audit will verify that your operations are in compliance with it. If you have no policy, you can audit using what are considered "best practices," but you will need the assistance of someone who is familiar with what those are. Many security books outline a checklist approach to security, but this attacks the problem from the wrong direction - the bottom up. Any security implementation that does not address the issue from a process-driven approach is a patchwork attempt at best and will fail to provide comprehensive security.

    Figure 1 outlines a process-driven, top-down approach to implementing security. If you have no policy, start with a needs and risk analysis, which identifies your business needs, your computing assets, and the possible risks to those assets. From there you can begin to assess and develop technical and operational solutions to meet your business needs and protect your computing assets. Documentation of how you will implement these solutions becomes your information security policy. There are many resources available to assist you with developing a written policy. A quick way to bring a policy up to speed is with a copy of Charles Cresson Wood's book, Information Security Policies Made Easy. The book and accompanying diskette come with 730 prewritten policies, each with an explanation, and cover nearly every area related to information security. By using the template policies provided, you can quickly customize the policy for your particular site. Send email to:

    charles@baselinesoft.com

    for information on how to obtain a copy. Another information resource is the freely available RFC 1244, the Site Security Handbook, published by the Internet Engineering Task Force (IETF). RFCs, or "Requests for Comments," are usually the precursor to a defined Internet standard for a particular technology. This 100-page FYI document helps system administrators prepare site-specific security policies. It provides information on policies for access to computing resources, procedures to prevent security problems, and recommendations for security incident handling. You can find it on the Web at:

    http://ds.internic.net/rfc/rfc1244.txt

    or

    ftp://info.cert.org/pub/ietf/ssphwg/rfc1244.txt

    Get Permission

    Audits should be performed as part of your official job capacity. If you are auditing a system just to show management how insecure your systems are - beware. There have been cases of well-meaning people exposing security weaknesses whose efforts were not met with appreciation, most notably the case of State of Oregon vs. Randal Schwartz. See http://www.lightlink.com/fors for details.

    Always get permission first, or if asked by management to perform an audit, get the request in writing or via email. Sometimes computing resources cross management or department boundaries; if the scope of your audit covers more than one manager, get permission from each manager or department head involved. If you are a consultant, have the client sign a form indemnifying you against any legal action. The reasons for this are basic: security auditing has the potential to uncover policy violations or even criminal activity. Should this information result in termination of employment for any individuals, it is important to have a legal umbrella shielding you from any fallout. If you are a consultant considering performing security audits, there are additional requirements to consider. In the course of an audit, proprietary information regarding passwords, configurations, and so on will be exchanged and must be protected. A non-disclosure agreement should be executed between all parties to ensure protection and to authorize access to the kind of information needed to conduct the audit. Once the proper authorizations have been obtained, you are ready to move on.

    Define the Scope

    Before you can verify something, you must know what you are verifying. Depending on the size of your company, your audit may cover a broad scope or may be very narrow. It's important to identify the target and the boundaries of what you intend to audit. A comprehensive audit should cover each section of your information security policy. The most common areas of an audit include:

  • physical security

  • host servers

  • network services

  • firewall

    Once you have identified the scope of the audit and secured the proper authorizations, you can begin. As stated previously, an audit is a procedure for comparing day-to-day operation with your computing policy. This can be quite a tedious task without tools to assist you, so here are some of the more common ones.

    Identify the Tools

    Security auditing tools fall into two types - commercial off-the-shelf (COTS), and contributed, or public domain. Some are useful for auditing, while others assist in the ongoing security of your systems and network. Some of the categories are:

  • host-based auditing tools

  • network-wide auditing tools

  • network traffic analysis tools

  • security management tools

  • firewall and perimeter security tools

  • encryption tools

  • authentication tools

    Ask anyone who's been around UNIX for a while and they will typically tell you: if you need a tool, write it yourself. UNIX provides a variety of commands, utilities, and languages that assist in developing the right program for a given need. This is especially true for security tools; there are more contributed tools for information security than for perhaps any other subcategory of system administration. However, some of us have neither the time nor the skill to write a comprehensive set of security audit tools. Many of the contributed tools are written by well-respected security specialists, while others are from people you have never heard of. This raises a fundamental question: who can you trust? You must be able to trust the results of your audit tools, and to do that, you must verify that none of the tools used in an audit have been tampered with.

    Some companies have a policy that prohibits the use of contributed software. If you are not restricted to COTS solutions, there are many respected contributed security tools that are considered "industry standard." The source code to most of these programs is available for inspection, which reduces the risk of a Trojan horse existing within the software. If you are particularly paranoid, you can examine the source yourself. If you trust the author and the distributing organization, then the integrity of these programs, which are distributed mostly over the Internet, can be validated with the assistance of digital signatures.

    A digital signature indicates whether a file or a message has been modified. The type of digital signature most often used is the message digest. A message digest (also known as a cryptographic checksum or cryptographic hash code) is a special number produced by a function that is very difficult (if not impossible) to reverse. Using a message digest program such as MD5, which generates 128 bits of output, there are on the order of 1.7 x 10^38 possible input values of the same length to try before finding an input that generates the correct output. Another common strong authentication method used to verify file integrity is PGP, Phil Zimmermann's Pretty Good Privacy program, which uses public key encryption to protect email and data files. The authenticator uses his private key to encrypt a digital signature, which anyone else can check by using the authenticator's public key to decrypt it. These methods provide a means of strong authentication for verifying the integrity of the audit tools.
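
    As a concrete illustration, the check itself is trivial once you have a trusted copy of the published digest. The following is a small sketch, assuming Python is at hand; the digest value shown is only a placeholder, and in practice you would take it from the author's signed announcement or the archive's checksum file.

    # verify.py -- sketch of comparing a downloaded tool against a
    # published MD5 digest; the value below is a placeholder only.
    import hashlib, sys

    PUBLISHED = "d41d8cd98f00b204e9800998ecf8427e"    # placeholder digest

    def md5_of(path):
        digest = hashlib.md5()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    if __name__ == "__main__":
        actual = md5_of(sys.argv[1])          # e.g. a downloaded tar file
        if actual == PUBLISHED:
            print("digest matches the published value")
        else:
            print("MISMATCH: got %s" % actual)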

    There are many sources for obtaining contributed security tools on the Net. Perhaps one of the most respected is the COAST security archive, located at the Purdue University Dept. of Computer Science. COAST stands for Computer Operations, Audit, and Security Technology. You can find it at:

    http://www.cs.purdue.edu/coast

    or

    ftp://coast.cs.purdue.edu/pub/tools/unix

    According to COAST, it is the largest single archive on the Internet of papers, tools, standards, reports, mailing lists, and other information related to computer security, law, incident response, and information protection. I have used this site for some time and found it very useful; it should be your first stop for information and resources regarding information security. If you would rather use vendor-developed solutions, there are many choices, with a variety of commercial products offered by various security vendors. Internet Security Systems, located at:

    http://www.iss.net

    offers a complete range of Internet, network, firewall, and host scanners known collectively as "SAFEsuite." Another resource is Datalynx at:

    http://www.dlxguard.com

    They offer several products, including "Guardian," "Stalker," and "suGuard." Guardian provides account and access control, Stalker provides system monitoring to tell you who did what, when, where, and how, and suGuard allows you to allocate system responsibilities without revealing sensitive passwords. If you have HP-UX systems, take a look at:

    http://www.hp.com/go/security

    These are a few examples of many companies offering commercial solutions. Once you have reviewed the various tool choices in contributed and vendor software, obtain the tools for each category you plan to audit (physical, hosts, network, firewall) that you feel best address your specific needs.

    Conducting the Audit

    A good place to start when conducting an audit is by interviewing a cross-section of users and technical and management staff. The intent of the interviews should be to develop an understanding of how these key individuals believe security works, and how they are implementing security policy. For example, by interviewing users on how often they change their passwords and whether they understand the rules of password composition, you can gain insight into the understanding users have for security policy. To obtain full cooperation, it should be made clear that there is no penalty for incorrect answers and that no names will be used in the audit report. By interviewing several users, it will become clear whether they understand published policies, and whether these policies are being communicated clearly as a function of their job responsibility. One of the recommendations of the audit report may be that security awareness training for users needs to be improved. A similar interview of management staff might ask if the manager communicates to the system administration staff when an individual who reports to them leaves the company, or goes on extended leave, thus allowing for the deactivation of the user's accounts. By the time you complete the interviews, you will have a pretty good understanding of what you will find when you audit the hosts and networks.

    Earlier, I defined four key areas to examine during an audit: physical security, hosts, network services, and the firewall. Although it's impossible to cover every item you might want to check, I will discuss some of the more common ones.

    Physical Security

    This area deals with physical access to computing resources. Bridges, routers, hubs, systems, and other network components are all prone to abuse if they are easily accessible. All computing resources should be accessible by authorized individuals only, and access to systems and consoles should be restricted to a secure area. Keys or codes should be changed regularly, and also whenever anyone with a key or code leaves the company. Wiring ducts and cable tracks should not be easily accessible; someone could easily tap into your network backbone and "sniff" passwords or other critical information off your network.

    Another area subject to abuse is information stored on backup media. Anyone with access to these media could restore the backups to an offsite machine and recover the data stored there. Sensitive information on computer printouts or discarded magnetic media should be thoroughly shredded or destroyed. A common technique for obtaining inside information is "dumpster diving," the practice of sifting through trash from company waste bins in the hope of finding password or network information. Your site should have secure bins for disposal of sensitive information, and the contents of these bins should be shredded to prevent accidental disclosure.

    Hosts

    Host security is primarily concerned with authentication and access control, and it is an area that grows weak over time. Most UNIX systems are configured to use what is known as Discretionary Access Control, which allows the owner of a resource to assign various permissions (read, write, and execute) for three categories of users: user (the owner of the file), group (the users in the group to which the file belongs), and other (all other users). Most host systems are compromised through a series of escalating privileges. All it takes is one user with poor file permissions or a weak password to allow a would-be intruder to gain a foothold on the system. From there the intruder can increase privileges through weak access control until, ultimately, he gains root access. Some of the key areas to audit with respect to host security are listed below (a short sketch for checking two of them follows the list):

  • password composition

  • password file format

  • group file format

  • device file permissions

  • system logs

  • SUID and SGID programs

  • ownership and permissions in root path

  • default file creation permissions

  • system start up programs

  • home directory permissions

  • permissions of scheduled programs

  • vendor installed software

  • unsecured modems

  • known bugs in binaries

  • vendor patches
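
    As an example of turning two of these items into an automated check, the sketch below (assuming Python is available) walks a directory tree and flags SUID/SGID programs and world-writable files; the results can then be compared against the list of programs your policy actually allows to carry those permissions.

    # hostaudit.py -- sketch covering two items from the list above:
    # SUID/SGID programs and world-writable files under a given tree.
    import os, stat, sys

    def audit(root):
        for dirpath, dirnames, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                try:
                    st = os.lstat(path)
                except OSError:
                    continue                         # unreadable entry
                if not stat.S_ISREG(st.st_mode):
                    continue
                if st.st_mode & (stat.S_ISUID | stat.S_ISGID):
                    print("setuid/setgid:  %s (mode %o)" % (path, st.st_mode))
                if st.st_mode & stat.S_IWOTH:
                    print("world-writable: %s (mode %o)" % (path, st.st_mode))

    if __name__ == "__main__":
        audit(sys.argv[1] if len(sys.argv) > 1 else "/usr")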

    Network Services

    The motivation for network security is simple: it protects your hosts from unauthorized access and network attacks, and it protects you from flawed protocols that have been exploited in the past. Your policy should define which services are allowed; all services not explicitly allowed by policy should be denied. Some areas to audit with respect to network security are listed below (a small sketch for checking inetd services follows the list):

  • network services and configuration files

  • NFS exported file systems

  • NIS

  • DNS

  • Windowing systems

  • ftp and tftp configurations

  • host equivalency and trust relationships
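
    A simple example of auditing network services and configuration files is to compare the services enabled in /etc/inetd.conf against what your policy allows. The sketch below assumes Python and a conventional inetd.conf layout; the ALLOWED set is only an example, not a recommendation.

    # inetdaudit.py -- sketch comparing services enabled in inetd.conf
    # against a policy list; the ALLOWED set is an example only.
    import sys

    ALLOWED = {"ftp", "telnet"}       # example policy, not a recommendation

    def audit_inetd(path):
        with open(path) as f:
            for line in f:
                line = line.strip()
                if not line or line.startswith("#"):
                    continue          # skip comments and disabled entries
                service = line.split()[0]
                if service not in ALLOWED:
                    print("not allowed by policy: %s" % service)

    if __name__ == "__main__":
        audit_inetd(sys.argv[1] if len(sys.argv) > 1 else "/etc/inetd.conf")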

    Firewall

    A firewall is designed to be a perimeter defense encircling your trusted network. Traditionally, firewalls have been the demarcation point between the inside trusted network and the outside world, and many times the strategy has been to harden the security of the firewall machine to decrease the need for security on the inside trusted network. More companies, however, are realizing that more security incidents originate inside their trusted network than outside. This has led to the increased use of firewalls to compartmentalize sensitive departments, such as research and development or financials. In any case, particular attention must be paid to the security of the firewall system. Some firewalls are simply packet-screening routers. This is not an ideal solution, but it can be a sufficient deterrent in some situations. If you are using a screening router, CERT recommends filtering the following services (a sketch for probing these ports from outside the firewall follows the list):

  • DNS zone transfers - port 53 (TCP)

  • tftpd - port 69 (UDP)

  • link - port 87 (TCP)

  • SunRPC and NFS - ports 111 and 2049 (UDP/TCP)

  • BSD UNIX "r" commands - ports 512, 513, and 514 (TCP)

  • lpd - port 515 (TCP)

  • uucpd - port 540 (TCP)

  • OpenWindows - port 2000 (UDP/TCP)

  • X windows - port 6000+ (UDP/TCP)
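
    One way to verify that a screening router is doing its job is to probe these ports from a host outside the firewall and flag anything that still accepts a connection. The sketch below (assuming Python) checks only the TCP side of the list; the UDP services (tftp and the UDP side of the portmapper and NFS) need a separate test and are omitted here.

    # fwcheck.py -- probe, from outside the firewall, the TCP ports that
    # should be filtered; any port that accepts a connection is flagged.
    import socket, sys

    FILTERED_TCP = {53: "DNS zone transfer", 87: "link", 111: "sunrpc",
                    512: "rexec", 513: "rlogin", 514: "rsh", 515: "lpd",
                    540: "uucpd", 2000: "OpenWindows", 2049: "nfs",
                    6000: "X11"}

    def check(host):
        for port, name in sorted(FILTERED_TCP.items()):
            s = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
            s.settimeout(3.0)
            try:
                s.connect((host, port))
                print("NOT FILTERED: %d/tcp (%s)" % (port, name))
            except (socket.timeout, OSError):
                pass                  # refused or filtered, as expected
            finally:
                s.close()

    if __name__ == "__main__":
        check(sys.argv[1])            # e.g. python fwcheck.py firewall-host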

    There may be other ports that should be filtered depending on your policy. Because of the variety of firewall implementations, I cannot discuss everything that should be audited with respect to firewalls, but several principles hold true regardless of the implementation:

  • pay attention to ports and services configured

  • disable all nonessential services and programs

  • remove all nonessential drivers from kernel

  • remember that the firewall is just the front door; other services (such as modems) may allow entrance through a side door

    Document Your Findings

    Once you have completed the investigative phase of the audit, you must document your findings. This documentation should cover both what is working well and what is broken. By summarizing your successes, you validate that your investment in security development and training is paying off; you must also document where your findings differ from your policy. Because of the sensitive nature of an audit report, the initial distribution should be limited to management only, and then only on a need-to-know basis. All copies should be numbered and signed for. Your audit may have uncovered serious policy violations or even criminal activity; in this case, management must decide how best to handle the incident, and your security policy should have a section dealing with incident response. After the initial report has been reviewed with management, action items can be assigned to the various support staff and system administrators to perform corrective action. The audit report should never be stored electronically on a system without some type of strong encryption, and all hard copies should also be kept secure. Over time, as you perform more audits, your previous audit reports will provide valuable information on how to improve policies and procedures.

    Conclusion

    The key to maintaining a secure computing environment is consistency. By being consistent in how you implement security on your hosts, network, and physical environment, in conjunction with regular audits to catch the inconsistencies, you can greatly reduce your risk of a security-related incident. Remember that audits should not be a substitute for regular review of your log files and keeping track of who is accessing your network services. An active review of log files on a daily or weekly basis will allow you to detect unauthorized activity long before a scheduled audit.

    Bibliography

    Garfinkel, S. and Spafford, G. 1996. Practical UNIX and Internet Security, pp. 167-168. O'Reilly & Associates, Sebastopol, CA.

    Bernstein, T., Bhimani, A., Schultz, E., and Siegel, C. 1996. Internet Security For Business, pp. 77-82. John Wiley & Sons.

    Wood, C.C. 1996. Information Security Policies Made Easy. Baseline Software.

    Site Security Policy Handbook Working Group, 1991. Site Security Handbook. RFC 1244, Internet Engineering Task Force.

    Pipkin, D.L. 1997. Halting the Hacker, pp. 88-90. Prentice Hall.

    Venema, W. and Farmer, D. 1996. Security Auditing & Risk Analysis. Internet.

    About the Author

    Jack Maynard is a systems software engineer for Hewlett-Packard Co. in Seattle, Washington, where he specializes in Internet and network security. Jack can be reached via email at jack_maynard@hp.com.
