Questions and Answers
Bjorn Satdeva
CERT's latest advisory is about a security flaw in talkd, the daemon used by the talk program. In itself, this is not newsworthy; however, it is an example of how some people are continuously looking for flaws in existing network software and will exploit them as they are found. It is also an example of how poorly written software can be used to gain unauthorized access to machines connected to a network.
I wish I could say that only old UNIX software has these kinds of flaws, but sadly that is not the case. With the explosion of the Internet, there has been an enormous growth in the need for skilled security people. A number of people are using this demand as an opportunity to move into the market as consultants or software vendors without the technical background needed to give good advice. In the consulting business, I frequently encounter people who speak with great authority about security but are almost clueless about many of the issues. As one security consultant remarked when we discussed this sorry state of affairs, "If you can spell security consultant, then you are one."
Not too long ago, I attended a meeting at a client site in which the possibilities of the World Wide Web were explained to a number of users. During the half-hour presentation, all the nifty things you can do with ActiveX, such as dropping directly into a spreadsheet or word-processing program, were presented. However, not once during the meeting were the significant security risks of these activities mentioned.
The Web is scary. Users and vendors alike are pushing for new functionality every day. Netscape and Sun are at least trying to do the right thing, even though it is an almost impossible task. Microsoft, meanwhile, appears not to care about security. This is a calculated risk that has worked for them in the past, but there is a very large difference between what is acceptable on a standalone desktop machine and what is acceptable on a networked computer, especially one connected to the Internet at large. It will be interesting to see how the world reacts when the problem gets so big that putting your head in the sand no longer works, no matter how hard you try.
In the meantime, as UNIX system administrators, what can we do? Firewalls may not be politically possible, or we may be stuck with an NT-based firewall. We might be forced to open up holes through the firewall that allow updates of an internal database. In many cases, the underlying security paradigm is absent, and the security that has been implemented is flawed on many levels. To answer my own question: I am afraid there is not much we can do, except educate whenever we get the opportunity.
And now to this month's questions:
Do you know of any utilities that will allow for pinging multiple hosts from a file? I have tried fping, but couldn't get it to compile under Linux. Thank you so much.
No, I don't know of any. However, a competent system administrator should be able to write small scripts to do these types of jobs. In Listing 1, you will find a five-minute hack that does what you asked. It is a far cry from the capabilities of fping, but it will do the job until you find a way to fix the compilation problem you encountered with fping.
Listing 1
Ping multiple hosts
#! /bin/perl
# Ping each host listed in /etc/pinglist and report whether it responds.
$File = "/etc/pinglist";
open( FILE, $File ) || die "Cannot open $File";
while ( <FILE> ) {
    next if /^#/;        # skip comment lines
    next if /^\s*$/;     # skip blank lines
    next if /^127\./;    # skip loopback addresses
    chop( $Host = $_ );  # copy the line and strip the trailing newline
    $Res = `ping -c 3 $Host`;
    # The host is considered up if all three packets came back.
    if ( $Res =~ /3 pac\w* trans\w*, 3 pac\w* rec\w*, 0% pac\w* loss/ ) {
        print "$Host OK\n";
        next;
    }
    print "$Host FAILED!!!\n", $Res;
}
close( FILE );
A system administrator does not need to be a programmer, but I believe that some rudimentary knowledge of programming languages, such as Perl, is a must.
In the January 1997 issue of Sys Admin, you talked about tracking/req. I ftp'd several tar files, but am not sure where to go from there. The README does not go into installing the programs.
Documentation for the software you find on the Internet is not always of the same quality as what you find with a commercial package (although sometimes it can be a lot better). The old saying that there is no such thing as a free lunch is certainly true for free software. In the case of req, if you unpack the tar archive, you will be a little further along toward a complete installation. The README file there points to the installation documentation in docs/install, where you will find the information you are looking for. Also, in the archive you will find a copy of the paper from the LISA conference proceedings in a file named req-usenix.ps.
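For what it is worth, the steps look roughly like this (the archive name below is a placeholder; use whatever version you retrieved):

gunzip req-X.Y.tar.gz       # only if the archive you fetched is compressed
tar xvf req-X.Y.tar         # unpack the distribution
cd req-X.Y
more README                 # points you to the installation instructions
more docs/install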
In reviewing some documents about enterprise management software suites, I have become aware that most of them don't really deal with wrapping ports or filtering traffic. I'm curious as to why. To my way of thinking, protecting the network ports falls right in line with protecting the assets available on machines in the enterprise. I realize that there are packages that do manage network entities, but they do nothing for the hosts behind them. Looking for comments...
I think you are fairly correct in your statement. However, a number of problems are causing this situation. The biggest is that there is a large amount of local policy involved in system administration, and it is very, very difficult to develop easy-to-use system administration software that does not contain a lot of hard-coded policy statements. Another problem is that many commercial system administration packages are designed and written by engineers who have never done any real system administration. This in itself might not be so bad, but most software engineers I have met in my more than 15 years of UNIX system administration seem to think that because they can read the man pages for the system utilities, they know how to do system administration. Such products may appeal to people in management, but they often fall short when applied in real-life situations. Once in a while there is a product that clearly has been designed by system administrators. These products tend to do better initially, until engineering and marketing take control of them.
In regard to your comment about protecting the ports, again you are right. The problem here is similar to what I outlined above. People who think about security in terms of host security, and who are unfamiliar with the security problems that arise when that host is placed on a network, tend to ignore the problem. This is a very human reaction; unfortunately, what you do not know can indeed hurt you. Like you, I wish there were good system administration tools available that are appropriate for large sites. We are still in the same situation we were in 10 years ago with respect to recommending tools to our clients. In many cases, we still rely on publicly available software when making recommendations for how to improve system administration at a given site. This is unfortunate, but I think it is at least better than recommending commercial software, which sometimes fits like a square peg in a round hole.
I have just read the new SAGE publication on procedures. I know policies are something you've been interested in for a long time. Do you have any comments on this document? It seems immature to me.
I will have to admit that the SAGE Short Topics in System Administration publication, "A Guide to Developing Computing Policy Documents," has been a disappointment. Unlike the first document in the series, "Job Descriptions for System Administrators," this latest one is not up to par. It was written by some well-meaning people, but as you have observed, they did not really have enough experience with their chosen subject. In their defense, it must be admitted that policies are a very hard subject. Their first mistake was failing to realize that a policy is a highly localized document which, in conjunction with the procedures, describes "how we do business here." There are some good details, but on a general level, the SAGE publication fails. When you look at the sample policies, they all tend to be variations on a theme, and they all seem to be based on some simple but technology-centered university policies. Although computer policies naturally tend to have a technological aim, they should not have the technology as their center. Policies are an instrument for dealing with human issues, something that the SAGE authors seem to have lost sight of.
No business or organization acquires computers for their own sake. There is always a purpose for those computers, and therefore for the policies. As an example, a large company such as Boeing has a huge number of computers. However, Boeing is not in business just to purchase and run computers. If anything, the purpose of that business could probably be described as building reliable airplanes as effectively and inexpensively as possible. The computing policy documents, like everything else in that organization, must reflect that purpose.
The same is true for literally every organization you can think of, even in cases where computers are central to the business itself, as they are for service providers.
The bottom line, unfortunately, is that we are still in need of a good document that describes how to develop policies and procedures. If anybody knows of a good source for this information, please let me know.
Sorry to ask you these simple questions, but I've been told you're the one I should write to. If you know of a mailing list or a FAQ about Internet questions, please let me know. My first question is whether there is any company that sells Web servers. If yes, how much do they cost? And also how long does it take to implement such a product?
One source of commercial Web servers is Netscape, but there are probably other companies that offer such servers, too. If you are running a large Web site, using a commercial server is probably a very good idea in order to be able to get support. For smaller scale sites, there are a large number of freeware servers available. One of the best seems to be the Apache server, but you can find a number of different servers in the system administration archives.
I have a question concerning DNS wild cards. According to the literature, any host for which there is data in the zone file will not be matched by a wild card MX record. Assuming the zone file has an address record for every host within that zone, each of those hosts will need an explicit MX record, which is exactly why one might want to use wild cards, but the wild cards won't work because of the address records? This sounds suspiciously like a Catch-22! What am I missing?
The short answer is "don't use wild cards." The reason is that there are many possibilities for confusion when you use the wild card feature. Although it looks like an attractive shortcut to have an MX wild card that matches all your hosts, it will also match any mistyped hostname, which you probably do not want. So, unless you know what you are doing and positively know that you need the wild cards, you are better off leaving them out. As in so many places, a shortcut is the longest distance between two points.
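To make the behavior you describe concrete, here is a hypothetical zone fragment (all names and addresses are placeholders):

*.example.com.         IN  MX  10  mailhub.example.com.
mailhub.example.com.   IN  A   192.0.2.1
host1.example.com.     IN  A   192.0.2.2

Mail for a name with no records at all, such as a mistyped hostname, matches the wild card and is routed to mailhub. Mail for host1.example.com does not, because host1 owns data (its address record) in the zone; if its mail should also go to mailhub, it needs an explicit MX record of its own. That is the Catch-22 you noticed, and it is one more reason to skip the wild cards and list explicit MX records for the hosts that need them.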
About the Author
Bjorn Satdeva is the president of /sys/admin, inc., a consulting firm which specializes in large installation system administration. Bjorn is also co-founder and former president of Bay-LISA, a San Francisco Bay Area user's group for system administrators of large sites. Bjorn can be contacted at /sys/admin, inc., 2787 Moorpark Ave., San Jose, CA 95128; electronically at bjorn@sysadmin.com; or by phone at (408) 241-3111.