Security registry settings.

Christopher R. Hertel crh at
Fri Feb 7 08:05:55 GMT 2003

Okay, folks.  Here's where I am on security settings and their impact.  
The settings below are all based on NT4SP6, but similar stuff is (should 
be?) available for W2K and other Windows flavors.

1) MAC Signing:

On the server:

  The parameter EnableSecuritySignature is used to enable and disable SMB 
  MAC signatures.  If enabled (set to one (1) instead of zero (0)), the
  server will sign SMB packets if the client wants to sign packets.

  The parameter RequireSecuritySignature is used to force SMB MAC
  signatures.  The RequireSecuritySignature parameter has no effect unless
  EnableSecuritySignature is also enabled.  If both are set to one (1),
  the server will require that the client use MAC signatures.

On the client:

  Notice that the path is different (Rdr instead of LanManServer).

  Otherwise, this all works much as it does on the server: if
  EnableSecuritySignature is enabled, the client will perform MAC signing
  if the server supports or requires it.  If both EnableSecuritySignature
  and RequireSecuritySignature are enabled, then the client must use MAC
  signing, and the session will fail if the server does not have MAC
  signing enabled.  (The client closes the TCP connection immediately
  after the server sends the NegProt Response with the SecurityMode field
  indicating that it doesn't support MAC signing.)
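For reference, here is a sketch of the corresponding .reg entries.  The
paths below are what I believe the NT4 locations to be (LanManServer\
Parameters for the server side, Rdr\Parameters for the redirector); the
example values enable signing without requiring it.  Double-check on your
own box before relying on this.

```
[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\LanManServer\Parameters]
"EnableSecuritySignature"=dword:00000001
"RequireSecuritySignature"=dword:00000000

[HKEY_LOCAL_MACHINE\System\CurrentControlSet\Services\Rdr\Parameters]
"EnableSecuritySignature"=dword:00000001
"RequireSecuritySignature"=dword:00000000
```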

There are docs which state that W/9x cannot do server-side MAC signing, 
but can do client-side.
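As a sanity check on the rules above, here's a small sketch (plain Python,
not actual SMB code; negotiate_signing() and its return values are my own
invention) of how the four flags appear to combine:

```python
# A sketch of how the four signing flags appear to combine, based only on
# the NT4SP6 behavior described above.

def negotiate_signing(server_enable, server_require,
                      client_enable, client_require):
    """Return 'signed', 'unsigned', or 'fail' for a would-be SMB session."""
    # Require* has no effect unless the matching Enable* is also set.
    server_require = server_require and server_enable
    client_require = client_require and client_enable

    if server_require and not client_enable:
        return "fail"    # server demands signatures the client won't produce
    if client_require and not server_enable:
        return "fail"    # client drops the TCP connection right after the
                         # NegProt Response says the server can't sign
    if server_enable and client_enable:
        return "signed"  # both sides are willing, so packets get signed
    return "unsigned"
```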

2) Challenge/Response algorithm:

  The following KB articles are useful references: 147706, 239869.

  The registry variable LMCompatibilityLevel is used to set the minimum
  challenge/response algorithm.  On W/9x boxes the variable is
  LMCompatibility rather than LMCompatibilityLevel.  (Why do they do
  things like that?)

  Anyway, this is an annoying variable because it does too much.  There 
  are six possible values, ranging from 0..5.  They work like so:

    Client                              Domain Controller (or Server)
    ----------------------------------  ---------------------------------
  0 Default.  Client sends both LM      Default.  DC accepts LM, NTLM,
    and NTLM[v1] Response.              LMv2, and NTLMv2 responses.

  1 Discussion below.                   Discussion below.

  2 Client sends the NTLMv1 response    DC accepts LM, NTLM, LMv2,
    in both password fields (the same   and NTLMv2 responses.
    value twice...I've seen it).
  3 The Client places the 24-byte LMv2  According to the docs, at this
    response into the ANSI password     setting the DC still accepts
    field, and the longer NTLMv2        LM, NTLM, LMv2, and NTLMv2.
    response into the Unicode password
    field.

  4 The client sends both a 24-byte     The DC does not compare any
    response (probably the LMv2) and    response against the LM response.
    the longer NTLMv2 response.         NTLM, LMv2, and NTLMv2 are
                                        accepted.

  5 Same as level 4.                    The DC does not compare against
                                        LM or NTLM.  Only LMv2 and NTLMv2
                                        are accepted.

  From the testing I've done, the above is pretty close to reality.  The
  setting that bangs me on the head until my feet ache is
  LMCompatibilityLevel = 1.  The docs say that this enables "NTLMv2 
  Session Security", but I can't find docs on that.

  Abartlet tells me that it's specific to NTLMSSP.  My question, at this
  point, is: how do the client and server know to use NTLMSSP?
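The level table above can be summarized as data.  This is only my reading
of the table (the dict names and auth_ok() are my own; level 1 is left out
because its effect, "NTLMv2 Session Security", is exactly the open
question in this post):

```python
# CLIENT_SENDS maps a client's LMCompatibilityLevel to the
# (ANSI-field, Unicode-field) responses it sends; DC_ACCEPTS maps a DC's
# level to the response types it will still check.

CLIENT_SENDS = {
    0: ("LM", "NTLM"),
    2: ("NTLM", "NTLM"),      # the NTLMv1 response in both password fields
    3: ("LMv2", "NTLMv2"),    # 24-byte LMv2 in ANSI, NTLMv2 in Unicode
    4: ("LMv2", "NTLMv2"),
    5: ("LMv2", "NTLMv2"),
}

DC_ACCEPTS = {
    0: {"LM", "NTLM", "LMv2", "NTLMv2"},
    2: {"LM", "NTLM", "LMv2", "NTLMv2"},
    3: {"LM", "NTLM", "LMv2", "NTLMv2"},
    4: {"NTLM", "LMv2", "NTLMv2"},    # LM responses no longer compared
    5: {"LMv2", "NTLMv2"},            # only the v2 responses are compared
}

def auth_ok(client_level, dc_level):
    """True if the client sends at least one response the DC will check."""
    return any(r in DC_ACCEPTS[dc_level] for r in CLIENT_SENDS[client_level])
```

So, for instance, auth_ok(0, 5) comes out False: a level-0 client sends
only LM and NTLM, neither of which a level-5 DC will look at.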


Chris -)-----

Samba Team --     -)-----   Christopher R. Hertel
jCIFS Team --   -)-----   ubiqx development, uninq.
ubiqx Team --     -)-----   crh at
OnLineBook --    -)-----   crh at
