   I polled a security mailing list and got about 40 responses to a
selected set of questions dealing with security; the results may be a useful
indication of how the net (at least some of the security-minded people on it)
views security.  The answers to these questions shaped some of the philosophies
of COPS and might be indicative of the type of security tools to be
developed in the future.  My questions start with a number and a ")".

   1)  What kinds of problems should a software security system (SSS)
   such as COPS check for? (Mention specific examples, if you can.)

  Just about everyone agreed that the more things checked, the better.
Some specific requests for items I didn't mention, more or less in order
of the number of requests (a small sketch of two of these follows the list):

  Some kind of _secure_ checksum method for checking up on binary files.

  Checking binaries for known security problems - sendmail, fingerd,
ftpd, etc.

  Checking the validity of the _format_ of key files rather than merely
checking if they are writable.

  Checking for potential trojan horses; files such as "ls" in a user's
account.

  Finding things hidden under mount points.

  Keeping track of accounts in a separate file from /etc/passwd and
running periodic checks to see if any accounts have been added by any
unauthorized user.
  
  Reporting unusual system activity, such as burning lots of CPU time.

  Recording unsuccessful login attempts and su's to root, including when
and by whom if possible.
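
  To make a couple of these concrete, here is a small sketch of how a
checksum baseline check and a separately kept authorized-account list might
be implemented.  This is not part of COPS (and is written in Python rather
than the shell and C that COPS itself uses); the file names
checksums.baseline and accounts.authorized are made up for illustration,
and SHA-256 simply stands in for whatever "secure checksum" you trust.

  #!/usr/bin/env python3
  # Sketch of two of the requested checks (hypothetical file names):
  #  - verify binaries against a previously stored checksum baseline
  #  - compare /etc/passwd login names against a separately kept list

  import hashlib

  BASELINE = "checksums.baseline"     # lines of: <sha256>  <path>
  AUTHORIZED = "accounts.authorized"  # one authorized login name per line
  PASSWD = "/etc/passwd"

  def sha256_of(path):
      # Checksum a file in chunks so large binaries don't need to fit in memory.
      h = hashlib.sha256()
      with open(path, "rb") as f:
          for chunk in iter(lambda: f.read(65536), b""):
              h.update(chunk)
      return h.hexdigest()

  def check_binaries():
      # Warn about any file whose checksum differs from the stored baseline.
      with open(BASELINE) as f:
          for line in f:
              parts = line.split(None, 1)
              if len(parts) != 2:
                  continue
              want, path = parts[0], parts[1].strip()
              try:
                  got = sha256_of(path)
              except OSError as err:
                  print("Warning: cannot read %s: %s" % (path, err))
                  continue
              if got != want:
                  print("Warning: checksum changed for %s" % path)

  def check_accounts():
      # Warn about accounts in /etc/passwd not present in the authorized list.
      with open(AUTHORIZED) as f:
          allowed = set(line.strip() for line in f if line.strip())
      with open(PASSWD) as f:
          for line in f:
              login = line.split(":", 1)[0]
              if login and login not in allowed:
                  print("Warning: unauthorized account: %s" % login)

  if __name__ == "__main__":
      check_binaries()
      check_accounts()

  Run periodically (from cron, say) and mail yourself the output; the value
comes from keeping the baseline and the authorized list somewhere an
intruder can't quietly rewrite them.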


   2)  Are there any security problems too sensitive to be checked
   by a SSS?  That is, what things should *not* be built into a SSS?

     Boy, this was a landslide.  Over 90% said NO, and not only no,
but basically "Hell No".  The only concerns I got were about password
cracking and about problems that could not be easily fixed.  There was also
a small amount of concern about limiting access to root, but most realized
that no matter what, the benefits would outweigh any losses if the programs
were put out.

   3) What should the primary goal of a SSS be -- discovering as many
   security holes as possible in a given system (including bugs or
   design flaws that may not be easily fixed -- especially without
   source code), or merely uncovering correctable errors (due to
   ignorance, carelessness, etc)?

     Another landslide.  Of all the responses, only one person objected
to finding all holes, although a few did say that finding the fixable
holes was top priority.

    One view:

  My use for an SSS is as a system monitor, not as a diagnostic tool.
I suppose the diagnostic version also has its uses, but writing and
distributing such a program is asking for trouble.  I don't see
anything wrong with writing it and distributing only the binaries.


   4)  Do you feel that SSS are a security threat themselves?

     Some dissent begins to show.... It was almost even here, with the
no's beating out the yes's by a single vote.  However, 2/3 of the yes
votes qualified their answer by stating something like "a tool can be
misused" and whatnot.  Here are some typical responses:

Of course.  They point the way for bad guys.  Such is life.
They are a tool.  They have the potential for anything.  The
security threat lies in how they are used....

No, as long as they don't breed complacency. Just running
a SSS each night should not make you think your systems are
secure.

Fire is also dangerous but VERY useful.


   5) Do you think that the SSS should be restricted to be used only
   by system administrators (or other people in charge), or should
   they be accessible to all?

     Here's where the problems start :-)  Everyone wants as many
features as possible, but quite a few of you don't want anyone else
to have it.  Hmm...   Out of 35 responses on this question:
  12 - Yes, only SA's.
  10 - No.
   6 - It would be nice to have it restricted, but... How?
   5 - Have two versions; one restricted, one not.  Needless to say,
        the dangerous stuff should go in the first.
   1 - Restrict only parts that detect bugs/whatever that cannot be
        repaired.
   1 - Argh!  Help!

     Some quotable quotes:

I don't see how it could be restricted.

Admins, etc only. (possibly said because I'm an admin. From an
intellectual standpoint, I would want to know about this stuff even
if I was just a user)

I think the SSS should be restricted to system
administrators with the realisation that others can probably
get their hands on the code if they want to. 

Definitely available to all, SA's can be as lazy as anyone and should not be 
allowed to hide behind a veil of secrecy if, in doing so, they expose the 
systems they administer.

It seems to me that only an "administrator type" will have sufficient
privilege levels to make _effective_ use of such a tool.  Ordinary users
may be able to garner _some_ benefit though, if run on their own files.
If possible, can there be an "administrator" mode and a (restricted/limited)
"user" mode?

(and finally, my personal favorite...)

I think that a check for a hole that can't be closed shouldn't be a part of
the check, if that hole is widespread.  I have no examples of any such hole,
but a weak spot that can't be closed and has no workaround is one of the few
candidates for the security by secrecy concept.  I have mixed feelings about
this, but if I can't fix the hole, I'd rather not have its existence be
"public" knowledge.  A freely available routine to locate the hole would
spread its existence far and wide.....(?)
But, if I didn't know about it beforehand then it would be good to have a
tool to tell me it existed.  Gads, I hate moral conflicts!
 

   6) When a SSS finds a security flaw in a system, do you want it to 
   indicate how they flaw could be used to compromise your system, or
   would you just accept the conclusion and apply a fix?

      This question was ill worded and grammatically incorrect, but still
managed to conjure up a lot of comments.  Some thought it was asking whether
the SSS itself should apply the fix.
      In any case, almost 3/4 said Yes, indicate exactly how to exploit
any potential hole.  As usual, there were a few with reservations about
the info getting out, but.... 
   Here are some of the more interesting comments:

                (Think about this one!)
  *I* would like to know to further my knowledge of Unix, but more importantly
to make sure that the version I have was not modified by a cracker to
put security holes *into* a system.  (That'd be sneaky :-)

   Security by obfuscation doesn't work.

   By definition, a SSS is a software system, and therefore has bugs in it.
If it reported a problem which would cause quite a bit of inconvenience if
fixed, or would be difficult to fix, then I would be much more apt to make
the fix if I knew how the problem could be exploited.  This is important,
because many, if not most, sites require only a moderate level of security,
and many security holes are fiendishly difficult to exploit.

   We cannot assume that end-purchasers of a system can be as aware of 
the internal workings of a system as the designers of the system (or SSS)
are.  If a security flaw is discovered, the administrators need to be
informed about what changes are necessary to remove that flaw, and what
repercussions they may have.
   [...]
   Imagine a SSS that knew sendmail(8) was a security flaw
allowing a worm to enter systems.  It would report that sendmail is a 
security flaw, please disable it like....  If the vendor had released
a patch, and the SSS didn't know about it, the administrator (in blind
faith to this SSS program) might disable a *very* useful program
unnecessarily.


   7)  Do you think that there is too much, not enough, or just about
   the right amount of concern over computer security?  How about at 
   your computer site?  At other sites?

      The "not enough"s won, but not by much.  I thought that given
the paranoia of a security group, this would be a larger victory.
Lots of people said it depends -- on the type of facility, the size, etc.
Large sites seem to have a healthier view of security (paranoia :-)) than
smaller or non-governmental ones.  Only 4 or 5 said there was enough concern.
A couple of people mentioned _The Cuckoo's Egg_ as suggested reading
(I heartily agree.)

   More quotes:

  (I don't know if the next answer is true, but I like it anyway!)

  This is really a deep philosophical question---something to talk about
over a few beers at the bar, but not here.

  I think it's a site dependent problem, and all the above are
true: too much, too little, and just right. Computing is not a
"one size fits all" situation. Having offered that opinion, I
think an assessment of my site or other sites is extraneous, and I
will reserve that opinion.

  ... more attention to unauthorized use of the networks.

   8)  Do you think that there should be a ruling body that governs
   and enforces rules and regulations of the net -- sort of a net.police?

     Some of you wondered what this had to do with software security, but
just about everyone answered anyway.  This one scared me!  The "No's" only
beat out the "yes's" by one vote.  Yikes!  Maybe I'm from the old school
of thought, but....  Several people said that it couldn't be done anyway;
a couple mentioned they would like a CERT-like agency to help out, but not
control, and finally two said that the laws and government were already
there to do this.

   It's there, de facto.  The free market is working pretty well.

   Absolutely. I quarrel with the "net.police" designation, per se, of
course, as do many others. But perhaps something more like a recognized
trade association, and providing similar services. Also, it is time that
the basic duties which must be reasonably performed by a site in order for
it to remain on the net should become a requirement rather than a matter
of individual whim.

   Yuck!  This is very distasteful to me.  It will probably be necessary
though as more and more people participate in the net.  Enforcement will
have to be judicious until secure networking is developed and implemented
generally.

   No.  Aside from the fact that it'd never work, I like Usenet as an
anarchy.  It has some rough edges, but for the most part it works.  What
does this question have to do with SSS-type programs?

   Enforcement will be tough and may hold back legitimate users.  But
we have to start somewhere.  So I suppose that I agree with having
net.police, as long as they don't turn things into a police.state.net. 


   9)  Do you believe that breaking into other people's systems should
   continue to be against the law?

      Only one said "no", and s/he had a smiley following the answer.
But there were some of you who voiced concern that it wasn't really
against the law to begin with.  In _The Cuckoo's Egg_, Cliff Stoll talked
about a (Canadian, I think) case in which the only reason the cracker was
prosecuted was for stealing electricity!  Less than a watt or something.
A few of you mentioned denial of services as being a just reason, but
what if they break in only at night, when no one else is on, and they
really don't take anything at all?  Should that be less punishable than
someone who sucks away user CPU/disk/whatever?

   Breakins should be encouraged and rewarded (1/2 :-).

   Yes.  Unquestionably.  However, those laws should not attempt to regulate
inter-system traffic to cause these things to happen.

   Yes - and as a felony in all cases, without exception.

   Yes but murder, rape, robbery... are more important and laws and
sentencing should reflect this. There are some around who want to treat
cracking as a capital crime!

   Yes, from the denial of services standpoint.  If I pay $XXX,XXX.XX for a
system and joe blow slides in and sucks away at those resources, there
should be a nontrivial penalty for getting caught.  Don't behead the guy,
but monetary fines or community service would be just fine.


   I don't know.  I'm not a philosopher.  Certainly causing
damage to others is wrong, including denial of service,
compromising sensitive info, or whatever.  I'm concerned
though that clamping down on young kids will discourage them
from becoming computer geeks.  I think we need to encourage
our young people to become technically literate.  If we
don't become a more expert society we can kiss it goodbye;
all we'll have left is our military solutions, like some
brainless jock bully...

   I'm not sure that it is everywhere - but: Yes.  Should attempting 
to break in be against the law: No.  Is this vague: Yes.

   I did not know that it was. The laws about it have not been tested and 
are vague and unclear. You need to be very clear about what the laws
are going to do. 

   **HELL FUCKING YES** Those of us who started in UNIX years ago have
for the most part *always* respected others!!  This I can't stress strongly
enough.


  10)  Is your site academic, government, or commercial in nature?

      Just over 1/2 of those that answered claimed university ties,
with about 1/4 being commercial, 1/6 government, a few research sites,
and a couple that were a mixture.  Sites included Sun, AT&T, SCO (Xenix),
the DoD, and the Army, among others.

(Guess where this one came from :-)

Research.  We invented Unix.

Academic with commercial applications.

Primarily academic, but we are part of the government.

Academic, except when collecting student fees *) *)