
Serious Shell Programming

Venkat Iyer

The aim of this article is to promote Bourne shell programming as an all-purpose alternative scripting language. I will describe some tricks and provide examples and explanations to demonstrate shell's little-known and little-used features. New shell programmers can use the functions described here to work around shell programming's restrictions, and experienced shell programmers may pick up a few tips. These examples are also available in source form, as described later in the Availability section.

Why Shell

The C shell is without doubt a useful interactive shell, but for programming it is not much use. One major deficiency in the C shell is the absence of functions. The only way to reuse code in C shell is to write each function in a separate file and call each file as required. Doing something serious requires either large scripts with repeated pieces of code or many script files.

Shell is an inseparable part of UNIX. Every UNIX system shipped, from Minix 2.0.0 to Solaris 2.5.1, has shell in it, and most shells seem to be pretty compatible with each other. It is fairly easy to write scripts that will work on a variety of UNIX systems.

Shell is portable. All the examples in this article have been tested on a variety of systems - a 286 running Minix 1.7.4, Solaris 2.5, SunOS 4.1.4, DEC OSF1 V2.0, and HP-UX 9.05. On Minix though, these examples were run under /usr/bin/ash, and on HP, I used /bin/posix/sh. Note that /bin/sh will not work on either of these systems. I think these examples will work with any POSIX-compliant shell.

Shell scripts are typically written by systems administrators or software vendors providing installation scripts for their UNIX-based products. I have written lots of scripts in shell for everything from an issue tracking system to HTML generators.

System Administration

Assume your /usr won't mount because of a typo in your /etc/fstab. Nothing will work - most of the commonly used executables are in /usr/bin. You have a few executables in /sbin, and sh is one of them. If you can edit your fstab using just sh, you will be up and running soon.

Install Scripts

Your install scripts cannot assume that Perl or Tcl is already installed on the target system. Using shell scripts instead of long readmes will make your software more system friendly.

Shell and Only Shell

Any regular shell programming book will tell you how to program in shell with all the standard UNIX text processing tools (like grep, sed, and awk) in place. However, I will discuss programming only in shell. All of my examples use just shell with the built-in functions. This is useful when none of the other utilities are available, but it can be useful otherwise too. If you are writing a shell script that executes a loop many times over, then not creating a new process for each iteration of that loop will save time. Some of the utilities I present will reduce the need for external processes. I have seen considerable speed increases when a tight loop of about a thousand iterations avoided creating an extra process on each pass.

Input and Output

There is only one command for input in sh - read. There is only one command for output - echo.

Assume you just want to copy one file to another; in this case, you want to copy /etc/passwd to passwd.old in the current directory. You could use cp to do the work for you, but for this article, I don't want to use other executables.

Cat()
{
    while read line
    do
        echo $line
    done
}

Cat < /etc/passwd > passwd.old

read is very versatile. Let's say my ls -al listing looks like this:

drwxr-xr-x  2 venkat  staff     512 Sep 10 16:54 Mail
-rw-r--r--  1 venkat  staff 2177993 Aug 13 09:10 anzd.pdf
lrwxrwxrwx  1 venkat  staff       8 Sep 17 15:04 art -> /usr/art
-rw-rw-r--  1 venkat  swadmin   398 Sep 18 10:15 b.html
-rw-rw-r--  1 venkat  swadmin   274 Sep 18 10:08 b.html~
drwxr-xr-x  2 venkat  staff     512 Jun 14 18:32 backup
-rw-r--r--  1 venkat  staff     257 May 23 09:51 backup.notes
drwxr-xr-x  3 venkat  staff     512 Sep 22 11:34 bin
-rw-r--r--  1 venkat  staff  180378 Aug 19 12:48 bip.ps
-rw-r--r--  1 venkat  staff  180393 Aug  7 14:12 bip.ps~

I want to print only the names of the files that belong to the group swadmin. The following example shows how to do this in shell. The example uses the built-in command [, which is the same as test. Please see the man page for test for a detailed explanation of the operators it supports. Here I am just checking for equality. Note that I have quoted both arguments to test ([); it is always good practice to use quotes.

In its default behavior, read splits an input line into fields and assigns one field to each variable in order. The last variable gets the remainder of the line; the "junk" in this example will get symbolic link information (such as -> /usr/art, which I am not interested in). If "junk" were not specified, then the symbolic link information would become part of filename itself.

Here is an example for splitting input into fields:

GetSwAdminFiles()
{
    while read perms blks user group size month day dateyear name junk
    do
        if [ "$group" = "swadmin" ]
        then
            echo "$name"
        fi
    done
}

ls -al > /tmp/xx.$$
GetSwAdminFiles < /tmp/xx.$$

You can change the way read splits input into fields by setting the shell variable IFS. This variable determines how shell splits fields on input. If you set IFS to something other than whitespace, sh splits fields on the characters in IFS instead of on whitespace, as shown in the following examples.

If you tried the Cat in the first example, you might have noticed that it strips out leading and trailing whitespace characters. I don't mind losing trailing spaces, but I like the indents on my text to remain. You can set IFS to something that you don't expect to encounter (e.g., ^\), as shown below. (^\ is the Control-Backslash character.)

Cat()
{
    OIFS="$IFS"
    IFS="^\"
    while read Line
    do
        echo $Line
    done
    IFS="$OIFS"
}

Output doesn't use the IFS, but it has its own quirks. If you do not want to print a newline after a message, there are two incompatible ways to do it, and most shells honor just one: the Solaris sh wants a \c at the end of the string, whereas the SunOS sh wants a -n option to echo. The following function from the library finds the suitable option for your shell. It assumes that if your shell doesn't understand -n, it will understand \c. You just use EchoN wherever you want to output something without a newline.

unset _NeedEchoBackslashC_

[ "`echo -n ff`" = "ff" ] || _NeedEchoBackslashC_=1

EchoN()
{
    if [ "$_NeedEchoBackslashC_" ]
    then
        echo "$*\\c"
    else
        echo -n "$*"
    fi
}

The most common reason to use output without newline is when you prompt for a value. The following example shows one use. This function, ReadValue, takes the variable name as the first parameter, the default value as the second parameter, and then the prompt. It prints the prompt, waits for the user to type in a value and returns it in the first parameter. If the user types in nothing, the default value is returned. If the user types in "none," then an empty string is returned. Note that you can use "none" as the default to keep the default as the empty string.

ReadValue()     # target default prompt
{
    var="$1"
    shift
    default="$1"
    shift
    result=""
    EchoN "$* [$default]:"
    read result
    [ "$result" ] || result="$default"
    [ "$result" != "none" ] || result=""
    eval $var=\"$result\"
}
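
For example, a hypothetical install script (InstallDir and the default path here are made up for illustration) could prompt like this:

ReadValue InstallDir "/usr/local" "Install directory"
echo "Installing in $InstallDir"

Pressing Return accepts /usr/local, and typing none leaves InstallDir empty.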

Regular Expression Matching

String equality and inequality tests can be done with the [ (test) command. Though sh doesn't explicitly have any regular expression matching operators, it is easy to write functions that do the equivalent of matches and not-matches, with a regular expression as the second argument.

These functions, Matches and NotMatches, use the case statement, which allows glob regular expressions as case patterns. A glob regular expression uses the shell metacharacters, just as when shell expands filenames. I use these functions frequently in my shell scripts. They return zero on success and one on failure, to conform to sh's notion of success and failure. By writing them this way, I can use them directly in an if statement, as demonstrated in the following example:

Matches()       # source regexp
{
    case "$1" in
    $2)
        return 0
        ;;
    *)
        return 1
        ;;
    esac
}

NotMatches()    # source regexp
{
    case "$1" in
    $2)
        return 1
        ;;
    *)
        return 0
        ;;
    esac
}

Example of Regular Expression Matching

Say that from the /etc/passwd file, you want the list of users whose home directories are not under /usr.

Note that in this function, CollectNonUsrNames, I collect all the names into a variable whose name is passed in. Most of the functions in the library set up some variable for return. As a convention, I use the first parameter to the function as the name of the output variable.

Remember to switch IFS back to what it was as soon as you are finished with the job. IFS also determines how shell parses command lines and can get you into trouble if it is not set to the default.

CollectNonUsrNames()     # target-var
{
    t1="$1"
    t2=""
    OIFS="$IFS"
    IFS=":"
    while read user passwd uid gid comment home shell junk
    do
        if NotMatches "$home" "/usr*"
        then
            t2="$t2 $user"
        fi
    done < /etc/passwd
    IFS="$OIFS"
    eval $t1=\"$t2\"
}

CollectNonUsrNames NonUsrs
echo $NonUsrs

Processing Words and Lists

I handle lists as just strings concatenated with spaces between them.

Listing 1 shows various operations on lists. GetFirstWord gets the first word from a list. If pathparts contains a list of words, then head will contain the first of those words. Make sure that you do not put $pathparts in quotes. This is one of the rare places where quotes can do harm. If you quote $pathparts, then you will get all of pathparts into head.
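
To give the flavor of Listing 1 (the actual listing may differ in detail), a minimal GetFirstWord can be written with the library's name-passing convention:

GetFirstWord()  # target-var word...
{
    eval $1=\"\$2\"     # $2 is the first word of the list
}

GetFirstWord head $pathparts    # $pathparts deliberately unquoted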

GetLastWord gets the last word from a list. Most of the work comes from the fact that you can access only $1 through $9; to access the rest, you have to shift them into that window. If you expect more than a few thousand parameters, then you could add another case to handle a four-digit $#.
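
A much simpler, and slower, GetLastWord than the one in Listing 1 can be written by shifting one word at a time:

GetLastWord()   # target-var word...
{
    t1="$1"
    shift
    while [ $# -gt 1 ]      # shift until only the last word is left
    do
        shift
    done
    eval $t1=\"\$1\"
}

GetLastWord tail $pathparts     # again, do not quote $pathparts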

ListDel and GetAllButLastword both traverse the whole list and can get slow on large lists.

Although shell doesn't have any commands for string processing, the SplitSep function provides some string manipulation ability. SplitSep is the splitter function; the first function is just a helper. The extra call is needed so that the new IFS is in effect when the arguments in the call to SplitWords are parsed. Note that you can't quote the $3 parameter to SplitWords (see Listing 2).
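
To show the idea behind Listing 2 (the names and details in the actual listing may differ), here is a minimal splitter built around a helper call:

SplitWords()    # target-var word...
{
    t1="$1"
    shift
    IFS=" "                 # join the already-split words with spaces
    eval $t1=\"\$*\"
}

SplitSep()      # target-var separator string
{
    OIFS="$IFS"
    IFS="$2"
    SplitWords "$1" $3      # $3 unquoted, so it is split on the new IFS
    IFS="$OIFS"
}

SplitSep t2 ":" "usr:local:bin"
echo $t2                    # usr local bin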

The example in Listing 3 shows how you can split and join words in a shell script. The first function, GetBaseName, takes a path as the second parameter. It assigns the basename of this path into the first parameter. GetDirName gets the directory name.
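
Using the sketch functions above, a GetBaseName in the spirit of Listing 3 could look like this:

GetBaseName()   # target-var path
{
    SplitSep t2 "/" "$2"    # break the path into words
    GetLastWord "$1" $t2    # the last word is the basename
}

GetBaseName base "/usr/local/bin/perl"
echo $base                  # perl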

Associative Arrays

Lists can get slow pretty quickly. If you really want to handle lots of data and need random access or need to find information based on other information, then lists are not of much use. You have to scan through them serially. Most scripting languages provide associative arrays of some kind. The following routines let you manipulate an associative array:

arrset - lets you set an element of the associative array

arrget - gets an element of the array, given the subscript (index)

arrnames - gets the list of subscripts of an associative array

arrclear - clears out an associative array

arrexist - checks whether an associative array exists

arrelemexist - checks whether a particular subscript is present

arrelemclear - clears one element of the array

arrelemclear can get slow if the array is big, so don't use it often (see Listing 4).
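
To show the idea behind Listing 4 (the actual listing is more complete), here is a minimal arrset and arrget; this sketch assumes that array names and subscripts contain only characters that are legal in shell variable names:

arrset()        # array subscript value
{
    eval _arr_${1}_${2}=\"\$3\"
}

arrget()        # target-var array subscript
{
    eval $1=\"\$_arr_${2}_${3}\"
}

arrset uid venkat 1001
arrget t1 uid venkat
echo $t1        # 1001

Each element is stored in an ordinary shell variable whose name encodes both the array name and the subscript; the other routines can be built on the same idea.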

Example Using Associative Arrays

If you've ever wondered whether there was an easy way to kill processes with a particular name, instead of using ps, grep, and kill, Listing 5 shows a script for you. Let's call it kp for kill processes. This is a script for Solaris 2.5. You might have to change it depending on what parameters your ps puts out. I just copied the read line from the header that ps put out. Examples of usage are:

% kp "*nfsd*" "*mountd" #kill processes with nfsd in them or ending in

mountd
% kp "/usr/local/etc/*" #kill all processes beginning with /usr/local/etc/

Usage Messages

I often put comments at the beginning of my scripts, but I used to have to repeat them for the usage message. The following function from the library generates a usage message from the comments at the beginning of the file. It prints all lines from the beginning of the file that begin with a # character. It ignores lines that begin with #! (so that your #!/bin/sh doesn't get printed) and stops at the first line that doesn't begin with a #. This function definitely eases documentation problems: you just write comments at the beginning of your file.

_ProgramName="$0"

DefUsage()
{
    BFS="$IFS"
    IFS="^\"            # ^\ stands for a literal Control-Backslash character
    while read Line
    do
        Matches "$Line" "#*" || break
        NotMatches "$Line" "#!*" || continue
        IFS="#"
        read Line <<EOF
$Line
EOF
        echo "$Line"
        IFS="^\"        # the literal Control-Backslash again
    done < $_ProgramName
    IFS="$BFS"
    exit 1
}

Example of Automatic Usage Message Generation

The following example demonstrates the usage of the DefUsage function. Note the use of the || operator. I use this operator often instead of the if statement because it is compact. It means run the command on the right if the command on the left fails.

Avoid the && operator as much as possible. In a && b, the right-hand command is executed only if the left one succeeds, and if either command fails, the combined command fails. A harmless false condition on the left of && therefore makes the whole command fail, and in a script run with error checking (sh -e) that failure can make the script exit at that point.

#!/bin/sh
# Usage:
#    deluser <Username|UserId> [-h home-dir] [-notify user]
#
# deluser will delete the user from the password file. If the -h
# option is specified, it will delete the user's home directory as
# well.  If -notify is specified, then user is emailed when this
# script succeeds
#

. shlib

[ "$1" ] || DefUsage

Availability

The functions discussed in this article and others described below are available from http://www.comit.com/.

Automatic switching to a better shell on HP.

A sophisticated command runner that can display and run commands, or run commands for a limited time - look for Command.

Separating a string into fields with any separator - look for SetVarsFromSepList.

Further Reading

The manual page for sh is a complete, although cryptic, reference. The manual pages for test and getopt will also be useful.

Tips

1. Avoid pipes; pipes add processes. Also, commands in a pipe are run as child processes and cannot modify variables in your script (see the example after this list).

2. Use double quotes liberally; only leave them out when you are sure that you want a string to be split into many arguments.

3. Shell variables are all global. In libraries, use names that you wouldn't normally use elsewhere; I use the convention that t1, t2, and so on are temporaries, and you could also use leading underscores. Note that the examples in this article use commonly used variable names.

4. Make a shell library; put frequently used functions into files that you can source as shown in the last example.
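
To see why tip 1 matters, compare the following two loops (a quick illustration, not part of the library). In most Bourne-type shells, the piped version runs the while loop in a child process, so the change to the variable is lost:

t1=""
cat /etc/passwd | while read line
do
    t1="$t1 x"
done
echo "[$t1]"        # usually prints [] - the loop ran in a child process

t1=""
while read line
do
    t1="$t1 x"
done < /etc/passwd
echo "[$t1]"        # one x per line of /etc/passwd

The second form also saves the extra cat process.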

About the Author

Venkat Iyer has a Master's in Computer Science, has six years of experience, and is Manager of Software Engineering at Comit Systems, Inc. His interests are languages and EDA tools. You can reach him at http://www.comit.com/~venkat or at venkat@comit.com.