SV: [jcifs] BFS vs DFS

Christopher R. Hertel crh at
Fri Jul 27 02:08:46 EST 2001


I wonder what is taking so long.  It sounds as though some particular
operation is just sitting there doing nothing while it waits for a
timeout.  Multithreading the crawler might help.
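A rough sketch of what a multithreaded probe could look like. This is plain Java, not actual jcifs code: the `listShares()` method here is a hypothetical stand-in (it just sleeps) for a real call such as `SmbFile.list()`. The point is that a thread pool plus a per-host timeout keeps one dead machine from stalling the whole crawl.

```java
import java.util.*;
import java.util.concurrent.*;

public class ParallelCrawl {
    // Hypothetical stand-in for a jcifs call like SmbFile.list();
    // here it only simulates network latency.
    static List<String> listShares(String host) {
        try { Thread.sleep(100); } catch (InterruptedException e) { }
        return Arrays.asList(host + "/share");
    }

    public static void main(String[] args) throws Exception {
        List<String> hosts = new ArrayList<>();
        for (int i = 0; i < 20; i++) hosts.add("host" + i);

        // Probe many hosts concurrently instead of one at a time.
        ExecutorService pool = Executors.newFixedThreadPool(10);
        List<Future<List<String>>> results = new ArrayList<>();
        for (String h : hosts)
            results.add(pool.submit(() -> listShares(h)));

        int found = 0;
        for (Future<List<String>> f : results) {
            try {
                // Bound each host with a timeout rather than letting
                // an unresponsive machine block the entire crawl.
                found += f.get(2, TimeUnit.SECONDS).size();
            } catch (TimeoutException e) {
                f.cancel(true); // give up on this host and move on
            }
        }
        pool.shutdown();
        System.out.println("shares found: " + found);
    }
}
```

With 10 worker threads, the 20 simulated 100 ms probes finish in roughly two batches instead of sequentially, which is the effect Chris is suggesting.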

Chris -)-----

On Thu, Jul 26, 2001 at 10:52:43AM +0200, Torgny Johansson wrote:
> What is the best thing to do then?
> I've written a crawler that just lists all the computers in the workgroups
> from top to bottom (currently not threaded) and it takes a very long time to
> do a full "crawl": about 11 hours for 430 PCs (far from every PC has
> shares), and that seems far too long. My code is probably (read: most
> certainly...) not optimized, so, briefly: what is the way to go to create an
> efficient crawler?
> Thanks
> Torgny Johansson
> -----Original Message-----
> From: jcifs-admin at
> [mailto:jcifs-admin at] On Behalf Of Allen, Michael B (RSCH)
> Sent: July 26, 2001 03:29
> To: 'jcifs at'
> Subject: [jcifs] BFS vs DFS
> I wrote:
> > try to minimize the size of your active list of URLs to
> > search, and therefore the number of URLs that might suddenly become invalid,
> > by using a Breadth First Search algorithm.
> This is not true. BFS would be awful for an SmbCrawler.
> Mike
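Mike's point can be illustrated with a synthetic tree rather than a real share hierarchy. The sketch below (plain Java, no jcifs; the uniform branching factor and depth are invented for illustration) walks the same tree twice, popping a deque FIFO (breadth-first) or LIFO (depth-first), and tracks the peak size of the active list of pending paths.

```java
import java.util.*;

public class FrontierSize {
    // Invented, uniform tree: every node above DEPTH has BRANCH children.
    static final int BRANCH = 8, DEPTH = 4;

    static List<String> children(String path) {
        // Depth = number of '/' separators in the path string.
        int depth = path.length() - path.replace("/", "").length();
        List<String> kids = new ArrayList<>();
        if (depth < DEPTH)
            for (int i = 0; i < BRANCH; i++) kids.add(path + "/" + i);
        return kids;
    }

    // Walk the whole tree; FIFO popping gives BFS, LIFO gives DFS.
    static int maxFrontier(boolean fifo) {
        Deque<String> frontier = new ArrayDeque<>();
        frontier.add("root");
        int max = 1;
        while (!frontier.isEmpty()) {
            String node = fifo ? frontier.pollFirst() : frontier.pollLast();
            frontier.addAll(children(node));
            max = Math.max(max, frontier.size());
        }
        return max;
    }

    public static void main(String[] args) {
        System.out.println("BFS max active list: " + maxFrontier(true));
        System.out.println("DFS max active list: " + maxFrontier(false));
    }
}
```

On this toy tree (branching 8, depth 4) the BFS frontier peaks at 4096 pending paths, one per leaf, while DFS peaks at 29, roughly one sibling set per level. A BFS crawler therefore holds a huge list of URLs that can go stale while they wait, which is the objection Mike is raising.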

Samba Team --     -)-----   Christopher R. Hertel
jCIFS Team --   -)-----   ubiqx development, uninq.
ubiqx Team --     -)-----   crh at
