A Sys Admin's Toolkit
Think back to the days when you were first given complete autonomy over a UNIX system. Now visualize the number of computers or networks you administered. Although everyone's experience is unique, many of us have struggled with developing and implementing standards within our company's data center. The purpose of this article is to review one topic within the area of data center standards -- software utilities for systems administration. I'm not suggesting that you run to a prompt and begin downloading all of the software mentioned here, because everyone's needs are different. These are programs I've used over the years on various projects, some sparingly, others daily. The article is organized into three parts: a list of software, suggestions for setting up a local software repository, and tips for downloading.
SA Tools -- The Administrator's Toolkit
In an interview a few years back, someone asked me, "What's UNIX?" After an extended silence, I fumbled through a textbook answer, saying that UNIX is a scalable, multi-user operating system, and so on. Even though this may have been a correct answer, it certainly was not what the person was after. In retrospect, I believe he wanted to hear what this operating system meant to me, and about the various experiences that led to my definition of UNIX. What's this got to do with an administrator's toolkit?
UNIX is basically an outgrowth of years' worth of programmers enhancing their workbenches. There was even one UNIX release called PWB (Programmer's WorkBench). As these programs and utilities gained popularity over time, many were incorporated into the standard OS. With all of the recent commercialization, this practice has decreased in breadth. However, the idea of a programmer's workbench, and of free (supported) software that is continuously developed worldwide, still prevails in many areas. Given this, it is well worth any programmer's or administrator's time to be aware of what's out there. Not only can this software make your life easier and improve your performance, but it may someday become part of the standard operating system you support.
Part One: Software, Location, and Supporting Sites
gzip -- Program for compressing and decompressing files. ftp://gatekeeper.dec.com/pub/GNU/gzip/gzip-1.2.4.tar.gz [http://w3.gzip.org/]
gcc -- GNU compiler collection for C, C++, Objective C, and other languages. Refer to this site regarding platform-specific issues when building gcc:
Pre-compiled versions of gcc:
http://sunfreeware.com (for any other versions)
http://www.rge.com/pub/systems/aix/bull/out/
http://hpux.cae.wisc.edu/hppd/hpux/Gnu/gcc-2.95.2/
[http://www.gnu.org/software/gcc/onlinedocs/]
lynx -- Text-based WWW browser.
make -- Maintain, update, and regenerate related programs and files.
Perl -- Practical Extraction and Report Language.
[www.perl.org & www.cpan.org] CPAN (Comprehensive Perl Archive Network)
lsof -- Lists all open files for running UNIX processes.
Tcl & Expect -- Tools for automating interactive applications.
Fastpatch, do-patch, and fix-modes -- Perl scripts that install Solaris patches faster than patchadd (the scripts also provide an undo switch).
PatchReport -- Perl script to assist in the automation of Solaris patch installation. Requires the addition of several Perl modules, libnet, Data-Dumper, MD5, and IO.
Shellutils -- Enhanced versions of standard OS programs as well as additional (handy) programs.
osh (Operator Shell) -- Setuid root, security enhanced, restricted shell.
sudo -- Restricted Root Access Utility. Allows a sys admin to give limited root privileges to users and log root activity.
qterm -- Utility to identify terminal types.
http://www.magnicomp.com ($2 per machine for commercial use)
rsync -- Faster, flexible replacement for rcp.
rdist -- Remote file distribution program.
Glimpse -- Search quickly through entire file systems (includes agrep, which is an extended grep program).
procmail -- Sorts (and handles) your incoming mail to do just about anything you'd like. It also has numerous other capabilities (such as create mail-servers and mailing lists; can be individual or system-wide mailer).
readline -- Library that provides a command-line editing interface to the programs that use it. (At one point ncftp required this library.)
Psutils -- Page imposition tools for PostScript files.
txt2pdf -- Perl script that converts text files to PDF format. http://www.sanface.com/txt2pdf-2.5.tar.gz
Python -- An interpreted, interactive, object-oriented programming language (often compared to Perl).
sysinfo* -- Show system information and configuration.
http://www.magnicomp.com/ ($10 per machine for commercial use)
memconf -- Perl script that displays memory modules installed in a Sun system.
top -- Display processes having highest CPU usage.
Big Brother -- Web-based UNIX system and network monitoring and notification system.
RMCmem -- Loadable kernel module on Solaris that uses the /proc interface to look at the memory allocation of processes and the UFS buffer cache.
proctool -- Performance monitoring for Solaris.
SE Performance Toolkit (SymbEL Language) -- Performance monitoring/extendable tuning toolkit for Solaris (Virtual Adrian).
traceroute -- Trace the route of IP packets.
ntop -- Fairly new utility (modeled on the top command's output) that shows (and captures) network utilization. Future objectives include improved control over network services/programs.
libpcap -- System-independent interface for user-level packet capture. Provides a portable framework for low-level network monitoring (applications such as ntop and Argus require it).
pping -- A TCP port pinger.
mrtg -- Multi Router Traffic Grapher; a largely Perl-based tool that polls devices via SNMP and graphs network traffic, typically run from cron.
netcat -- Reads and writes data across network connections.
nfswatch -- Monitors local network's NFS traffic.
tcpdump -- Tool for network monitoring and data acquisition.
sendmail -- Send mail over the Internet.
wuftp -- ftpd program from Washington University.
ncftp -- Browser (gui and text-based) program for the File Transfer Protocol, with added functionality.
JetAdmin -- HP network printing software.
Samba -- Provides file and print services to SMB/CIFS clients (such as Win95 & NT machines).
zmodem(rz,sz) -- Protocol/program that transfers files quickly and surely under real-world conditions.
netterm -- Improved terminal emulator. Includes support of zmodem for transferring files.
Apache -- http/Web server, open source product.
Lily -- A readline-based user interface to the Lily conferencing system, which is a real-time chat room type of environment where various professionals exchange information.
Webglimpse -- Index and search quickly through entire www sites.
wget -- Network utility to retrieve files from the World Wide Web, using the http and ftp protocols.
TCP Wrappers -- TCP/IP daemon wrapper package.
logdaemon -- Provides modified versions of rshd, rlogind, ftpd, rexecd, login, and telnetd that log significantly more information than the standard vendor versions.
Titan -- System tightening/securing tool that is made up of scripts.
Satan -- Security Administrator Tool for Analyzing Networks.
Tiger -- Set of scripts that scan for security problems.
MD5 -- Message-Digest Algorithm. A highly reliable fingerprint that can be used to verify the integrity of a file's contents (used to detect Trojan horse attempts).
Crack -- Password checking program for UNIX. ftp://ftp.cert.org/pub/tools/crack/crack5.0.tar.gz
Tripwire -- UNIX file system integrity monitor.
Argus -- A generic IP network transaction auditing tool.
Cops -- Examines a system for a number of known weaknesses and alerts the administrator.
ISS -- A multi-level security scanner that checks for a number of known security holes. Originated as a free product, but it has evolved into a much more complex and capable commercial product.
ftp://coast.cs.purdue.edu/pub/tools/UNIX/iss/iss121.shar.Z (free version)
http://iss.net/prod/isb.html (commercial version)
ssh (Secure Shell) -- Remote login program. http://www.ssh.org
autoconf -- Tool for producing shell scripts to configure software source code packages to adapt to many kinds of UNIX-like systems.
rc -- Revision control, automates the SCCS, CVS, and
Part Two: Local Software Repository
Let's face it, ftping does not take a genius. However, for an admin responsible for keeping dozens of servers and workstations loaded, upgraded, and running smoothly, the process can become wearisome. Therefore, I suggest implementing a local software repository, even if you are responsible for only half a dozen systems. The benefits far outweigh the cost. On my current assignment, there is one system (gadome) with a 600-MB filesystem holding all the required programs, utilities, and patches for more than 30 servers and numerous workstations. It took months of juggling and searching before we realized the need for a single software location. This organized approach to managing software led to a better process for developing and implementing standards. Now, when a new system comes through the door, a form is filled out that outlines the machine's purpose and provides a software/hardware profile.
To set up this system, first decide on a machine that is on the appropriate network for access. Then set up a 500-MB filesystem (or whatever you can scrape together). Decide on a directory structure that is most agreeable to all those who will be accessing it. The top-level breakdown at our site is: env, printing, X11, gnu, patches, security, docs, network, pc-utils, tools. From there, it branches out much the way many ftp sites do. Next, download and install Apache. The only changes made beyond ServerName and ServerAdmin were to point the DocumentRoot and <Directory> entries in Apache's httpd.conf file at the new filesystem. One additional suggestion is to set up a cron job that periodically updates an ls-lR file in your Document Root directory. Here is a one-liner to accomplish this:
cd <document-root>; ls -lR > ls-lR
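For a nightly refresh, that one-liner can be wrapped in a small function and run from cron. A sketch -- the 2 a.m. schedule, the script path, and the /export/repository document root are all assumptions to adapt for your site:

```shell
# update_lslr DOCROOT -- regenerate the ls-lR table of contents
# under the given Apache Document Root. The subshell keeps the
# caller's working directory unchanged.
update_lslr() {
    ( cd "$1" && ls -lR > ls-lR )
}

# Example crontab entry (2 a.m. nightly; paths are assumptions):
# 0 2 * * * /usr/local/bin/update-lslr /export/repository
```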
By doing this, you provide an up-to-date table of contents for all the software available on that server. It will save time in the long run and is quite convenient when there is haste to get and load software. Another consideration is access. There may be a section within your software repository that requires limited access. One way Apache can handle this is with an .htaccess file (pointing at a password file created with htpasswd) in the directories where these sensitive programs reside.

The following section includes real-life scenarios that take advantage of this server. When planning and implementing at your site, take the time to sit with co-workers and discuss their software needs if you are not already familiar with them. Unless you work in a shop with strict union guidelines, informing other appropriate parties of this repository and how to use it has its benefits: less network traffic, repetitive jobs offloaded from your workload, and more self-sufficient users. Depending upon the variety of software packages at your company and the user community, use of this server will either take off company-wide or simply remain an added convenience for yourself. Either way, it's a win situation.

Currently, we have begun to implement images -- all the supplemental software for a given department, bundled into a single (compressed) file. We place these on gadome (our repository server) under /env/images/<dept. name>. Benefits of this approach over a typical file server include wider access and less overhead. The extent to which you can set up and utilize your software repository is unlimited; the significant factor here is that the model is open.
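To sketch that restricted-access setup: Apache's basic authentication reads an .htaccess file in the protected directory that points at a password file built with the htpasswd utility (httpd.conf must allow the override with AllowOverride AuthConfig). The paths and realm name below are assumptions:

```shell
# write_htaccess DIR PASSWDFILE -- drop a basic-auth .htaccess into DIR.
# Realm name and paths are illustrative; adjust for your site.
write_htaccess() {
cat > "$1/.htaccess" <<EOF
AuthType Basic
AuthName "Restricted Repository Area"
AuthUserFile $2
require valid-user
EOF
}

# Create the password file first (htpasswd prompts for a password):
#   htpasswd -c /usr/local/apache/conf/repo.passwd jbeck
#   write_htaccess /export/repository/security /usr/local/apache/conf/repo.passwd
```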
Part Three: Downloading
There are numerous approaches when deciding how to download software. The most immediate decision is whether to download binary versions or the source code. With one exception, I recommend getting the source. The reasons involve network traffic as well as security -- namely, the Trojan horses I discuss later in the article. The one exception is the compiler itself; even though I suggest building it once for the experience, bootstrapping a compiler can take a fair amount of time. The network traffic savings come from not having to download the same software for each OS and version at your site. Instead, download the source once and build it on each machine where it's required.
The how-to of downloading involves which protocol, interface, and program you prefer. The most convenient way to download programs, without question, is through a browser. However, this is only feasible when the ftp site is running an http daemon and has the software in an accessible area (the htdocs area or an entry in their httpd's configuration file allowing it). I made the effort to list such sites when gathering software locations.
Before beginning, there are two things to note. The first is related to organization, and I covered that in the previous section. The other has to do with how kind you are when considering bandwidth. There's no reason everyone else on the network should suffer when you need a bunch of patches or a larger application downloaded. Instead, use one of the various ways to schedule (after hours) and automate downloads -- shell scripts, expect scripts, or ncftp (which has built-in scheduling capabilities). Consider these various scenarios:
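A sketch of one such after-hours scheme, using lynx -source for the transfer and at(1) for the scheduling. The URL, destination path, and 2 a.m. time are all illustrative:

```shell
# build_fetch_cmd URL DEST -- print the command that retrieves URL
# into DEST; lynx -source works for both http and ftp URLs.
build_fetch_cmd() {
    echo "lynx -source '$1' > '$2'"
}

# Queue the transfer for the middle of the night (paths are assumptions):
#   build_fetch_cmd ftp://ftp.gnu.org/gnu/wget/wget-1.5.3.tar.gz \
#       /export/repository/network/misc/wget-1.5.3.tar.gz | at 2am
```

Piping the generated command into at keeps the transfer off the daytime network while you go do something else.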
New mail arrives, consisting of three different requests for small jobs on the mail server. Surprisingly, I find that Perl has yet to be loaded on that box. In the past, I'd scramble around the two or three servers where I knew I had loaded it; often, depending on the program in question, I'd have to download it fresh. Now I simply launch lynx gadome/gnu/perl, arrow down to the latest Perl source file, and hit d to download (enter may also work, depending on your version of lynx). From here, Perl is only four commands away from being on the box.
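Those four commands are the standard unpack/Configure/make/install sequence. Here it is as a reusable sketch; it assumes the archive unpacks into a directory named after the tarball (real Perl release directories vary slightly), uses Perl's non-interactive Configure -des flags, and skips make test for brevity:

```shell
# unpack_and_build TARBALL -- from downloaded source to installed
# program. Assumes TARBALL unpacks into a directory named after
# itself (e.g., foo-1.0.tar.gz -> foo-1.0/).
unpack_and_build() {
    dir=$(basename "$1" .tar.gz)
    gzip -dc "$1" | tar xf - &&
    cd "$dir" &&
    sh Configure -des &&
    make && make install
}
```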
The operations manager calls and says that the folks on the help desk are having intermittent problems seeing their call-tracking database. After getting more details, I begin troubleshooting. traceroute is not there. I launch lynx gadome/ls-lR, do a quick search (/), and type traceroute. Bang, a match. traceroute's there less than a minute later.
An email arrives from someone in the Web group outlining the details of what they need for a project. Among the utilities that come to mind to help them is wget. I'm not sure whether wget is under the GNU license or not, and I'm certain that it's not on gadome. So, I telnet there and cd to the network/misc directory. I then go to one of the following sites:
and do a search for wget. Once I find out it's a gnu product, I decide not to follow the links returned. Instead, I note the latest release filename and use this one-liner:
lynx -source ftp.gnu.org/gnu/wget/wget-1.5.3.tar.gz > \
After downloading, I create a link for the source into a wget directory.
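That filing step can be captured in a couple of lines. The name derivation (strip the version suffix to get the package directory) is an assumption about how your tarballs are named:

```shell
# link_source TARBALL -- link a downloaded tarball into a per-package
# subdirectory, e.g., wget-1.5.3.tar.gz -> wget/wget-1.5.3.tar.gz.
# Run from the directory where the tarball was downloaded.
link_source() {
    pkg=$(echo "$1" | sed 's/-[0-9].*$//')
    mkdir -p "$pkg"
    ln -s "../$1" "$pkg/$1"
}
```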
My aim here was to show how basic it is to set up and maintain such a resource, as well as demonstrate its usefulness. It was not to review the software that I listed. In preparing the list, I first went to my own well. Thereafter, I searched for sites aimed at offering information and resources to UNIX systems administrators. I came across a number of excellent sites. One of the main issues is weeding through outdated software that is no longer being maintained. This effort is a continuous one, and is best managed by keeping up with the latest periodicals, researching and evaluating software, and utilizing sites dedicated to this. The most comprehensive site I encountered was http://www.stokely.com/UNIX.sysadm.resources/index.html. I've also started to reorganize and weed through the multitude of information I've collected through the years and will display this (at least for starters) at: http://www.geocities.com/jbeck695/unix-sysadmin. The software listed above will also be updated to assure the reliability of sites and I will post new software when applicable.
Additional Noteworthy Tidbits
For those who are unfamiliar with the term Trojan horse in the computer sense, it refers to wily crackers placing their own version of a program where those in need will readily find it. This leads to the critical point of downloading only from sites that have been around for a long time and have an established reputation. Also note that just because you build an application yourself does not assure that it is clean. CERT is one organization that reports computer security issues such as this. There was a widespread Trojan horse in the traceroute program some time back: someone had added a few pieces to the makefile, so that building the program granted root privilege on a particular port and then sent an external email announcing the fact. So, the point of downloading from reliable sources cannot be overstated. Additionally, when possible, use the md5 program or some other checksum to verify the integrity of the files you download.
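A quick way to make that comparison, assuming the GNU md5sum tool (the original md5 program prints an equivalent digest in a slightly different format):

```shell
# check_md5 FILE DIGEST -- verify FILE against the MD5 digest
# published on the distribution site; returns nonzero on a mismatch.
check_md5() {
    echo "$2  $1" | md5sum -c - >/dev/null 2>&1
}

# Usage, with the digest copied from the download site:
#   check_md5 wget-1.5.3.tar.gz <digest-from-site> || echo MISMATCH
```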
Beyond physical connectivity issues, there are two common errors that can occur when connecting to ftp sites. Both stem from the ftp server package and its configuration. There is a passwd-check facility that, for accountability purposes, requires anonymous users to supply a valid (RFC822-compliant) email address. It can be set up either to warn anonymous users or to prohibit them from gaining access. The second error arises when the ftpd server attempts a DNS reverse-lookup of the Internet address of the connecting host. So, clients at sites without name servers, or with misconfigured ones, will be out of luck.
Another hurdle in the task of downloading software is locating it. In the list above I included numerous ftp sites, most of which have been around for quite some time. The full url for one I frequently use is http://www.ns.rutgers.edu/htbin/ \
At times, particularly during DNS outages or slowdowns, you need the IP address of a non-local server. I often use samspade.org/18.104.22.168.
I'm currently on a project where the primary DNS server and the wide area network are both changing frequently. As a result, I've utilized this server quite a bit, as well as a 56-K modem hanging off one of my systems. It's always nice to have reserves available when things aren't running smoothly.
- The pping program is a great utility to run prior to network-related programs, such as lynx. For example, type pping ftp.site 80 before using lynx's -source option.
- A site dedicated to extracting individual files from .tar and .gz files, without downloading, is: http://www.delorie.com/ \
About the Author
Joe Beck has been doing systems administration for six years. He currently works as an independent contractor for Sun Microsystems, on a project for the State of New Jersey (Dept. of Labor, tax redesign) working with Sun Enterprise Cluster and Oracle Parallel Server. He can be reached at: email@example.com.