Too Many Open Files on 2.2.4
Tristan.Ball at vsl.com.au
Tue May 14 23:22:01 GMT 2002
After a little play with a test program, I can say that it's fixed in
64-bit mode, which I expected, I guess.
cc -xtarget=native -xarch=v9
results in a program that happily runs until it hits our ulimit max
of 15000 files, using fopen and friends.
I might try Samba in 64-bit mode then...
-------- Original Message --------
From: Tristan Ball <Tristan.Ball at vsl.com.au>
Subject: Re: Too Many Open Files on 2.2.4???
To: Jeremy Allison <jra at samba.org>
CC: Samba Technical <samba-technical at samba.org>
I came to the same conclusion late last night. It may be fixed if you
use Sun's 64-bit C compiler; however, I haven't fully tested it -
compiling Samba as LP64 gives a few warnings which I haven't chased down
yet. The include file defines the FILE structure as containing only a
16-element array of longs in 64-bit mode. Comments elsewhere in the
includes/man pages are basically to the effect that FILE is opaque -
don't touch. None of the 64-bit routines supplied for accessing
information in the structure give any indication either of whether more
than 256 descriptors are available...
I'll try with sfio for now, thanks for the pointer.
Jeremy Allison wrote:
> This is probably the somewhat braindead Solaris STDIO library
> biting you. It cannot cope with FILE * pointers if the underlying
> fd number opened is over 256 (it uses an unsigned char for the
> fd member of the FILE struct).
> This is why tridge wrote the XFILE library for HEAD (3.0). I
> believe there are replacement stdio libraries for Solaris that
> don't have this problem but - try forcing Samba to link with
> something like "sfio" for example.
> Does anyone know when Solaris will fix this problem ?