[jcifs] listFiles on a large directory runs out of memory

Jake Goulding goulding at vivisimo.com
Mon Jun 11 14:02:30 GMT 2007


I have a directory with 350k files, and calling listFiles() on it causes 
my JVM to run out of memory (128 MB heap). Dumping the heap and running 
jhat yields:

class - instances - total bytes
String - 682810 - 13656200
URL - 169916 - 14952608
SmbFile - 169767 - 24955749
char[] - 513042 - 50242964

Each SmbFile holds references to 3 Strings plus the URL, and the URL 
surely holds a few Strings of its own; all of those Strings ultimately 
resolve down to the char[] instances.

Is there some way of processing directories in batches? Or some kind of 
callback interface, where each file is created as it is listed, passed 
back, and can be garbage-collected once it has been processed? Thanks!
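For what it's worth, here is a sketch of the callback shape I have in 
mind. The names below are hypothetical, not jCIFS API, though jCIFS's 
existing SmbFileFilter seems close in spirit: listFiles(SmbFileFilter) 
invokes accept(SmbFile) once per entry, and returning false keeps 
rejected entries out of the returned array, so accept() could perhaps 
double as a per-file callback.

```java
import java.util.Arrays;
import java.util.List;

public class CallbackListing {
    // Hypothetical callback interface: invoked once per directory entry,
    // so no array of 350k SmbFile objects ever accumulates.
    interface EntryCallback {
        void handle(String name);
    }

    // Sketch of a listing method that streams entries to the callback
    // instead of materializing the whole result. Each name becomes
    // garbage as soon as handle() returns and nothing else holds it.
    static void listEach(List<String> entries, EntryCallback cb) {
        for (String name : entries) {
            cb.handle(name);
        }
    }

    public static void main(String[] args) {
        // Stand-in for a directory listing; a real implementation would
        // read entries from the server one batch at a time.
        List<String> fake = Arrays.asList("a.txt", "b.txt", "c.txt");
        long[] count = {0};
        listEach(fake, name -> count[0]++);
        System.out.println(count[0]); // prints 3
    }
}
```

With this shape the caller's peak memory is one entry at a time rather 
than the whole directory.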




