prevent infinite loop with recurse option enabled?
Justin Yackoski
justin at skiingyac.com
Wed Nov 7 20:03:03 GMT 2001
Chris Watt wrote:
> At 09:09 AM 11/6/01 -0500, Justin Yackoski wrote:
>
>> I may be wrong, but it
>>seems to me that the best solution is to checksum/hash the directory
>>contents and name, and I find it hard to believe that two directories
>>would be exactly identical very often...
>>
>
> Do you get the same effect if you actually smbmount the filesystem and do a
> ls -R on the mountpoint?
> I guess you probably would actually. . .
Yes, actually I've determined that whether I use smbmount or
smbclient, I can recurse until I get 40 directories deep in the share.
This is the case no matter how I do the recursion, so I assume 40 is
hardcoded someplace as the max limit.
I noticed on page 1 of the wget manpage, second paragraph under
Description: "Infinite recursion loops are always avoided by hashing the
retrieved data." Is this for some reason not applicable to smbclient,
would it be useful to add, or should I be posting this to the samba
developer list?
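The wget-style idea mentioned above could carry over to a directory walk: hash each directory listing as you descend, and stop recursing when you hit a listing you have already seen. A minimal sketch of that, using an in-memory map of paths to listings as a stand-in for a real smbmount tree (all the paths and the `tree` layout here are made up for illustration; as the earlier message notes, two genuinely identical directories would be falsely skipped by a contents-only hash):

```python
import hashlib

# Hypothetical share layout as an adjacency map: each "directory" maps
# to its listing.  The repeated listing under /a/b/a mimics a server-side
# loop (e.g. a looping link), which is what we want to detect.
tree = {
    "/a": ["file1.txt", "b"],
    "/a/b": ["file2.txt", "a"],
    "/a/b/a": ["file1.txt", "b"],   # same listing as /a: a loop
}

def listing_digest(entries):
    # Hash the sorted listing, analogous to wget hashing retrieved data.
    return hashlib.sha1("\n".join(sorted(entries)).encode()).hexdigest()

def walk(path, seen, visited):
    entries = tree.get(path, [])
    digest = listing_digest(entries)
    if digest in seen:          # identical listing already walked: stop
        return
    seen.add(digest)
    visited.append(path)
    for name in entries:
        child = path.rstrip("/") + "/" + name
        if child in tree:       # recurse only into subdirectories
            walk(child, seen, visited)

visited = []
walk("/a", set(), visited)
# /a/b/a hashes the same as /a, so the walk stops before looping.
```

Against a real mount, `entries` would come from `os.listdir()`, and hashing the directory name together with its contents (as suggested earlier in the thread) would reduce false positives.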
> Ok, the best solution I can suggest is that you use a contents checksum or
> depth limit
Do you mean in smbclient? So currently that's not possible, and you're
suggesting it would be useful to add such a feature/option?
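For comparison, the depth-limit alternative is simpler than hashing: just refuse to descend past a fixed number of levels. A rough sketch of that guard (the 40-level cutoff observed above suggests smbclient does something like this internally; `MAX_DEPTH` and the `chain` data here are made up for the demo):

```python
MAX_DEPTH = 3   # small value for demonstration; smbclient appears to stop at 40

def walk(path, children, depth=0, out=None):
    out = [] if out is None else out
    if depth >= MAX_DEPTH:      # refuse to descend any further
        return out
    out.append(path)
    for name in children.get(path, []):
        walk(path + "/" + name, children, depth + 1, out)
    return out

# A chain deep enough to trip the limit: /0 -> /0/1 -> /0/1/1 -> ...
chain = {"/0" + "/1" * i: ["1"] for i in range(10)}
result = walk("/0", chain)
# Only the first MAX_DEPTH levels are visited, however deep the chain goes.
```

The trade-off is the one implied in the thread: a depth limit never loops forever, but it also silently truncates legitimately deep trees, whereas a contents hash only skips (apparently) repeated directories.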
> It may take you a fair chunk of time to update the index, but as long as
> you don't write to the same file your PHP script is trying to read from,
> you can do this in parallel with processing queries (if you need to update
> the index during "office hours" when the Windows boxes happen to be turned
> on). If you want to do complex queries (using info other than the filename)
> it might be worth sticking your index in mySQL (or something similar),
> which happens to play nicely with PHP.
I'm actually using MySQL at the moment. Indexing 100,000+ files in a
text file (and searching it!), well, yes, that would be bad.
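The indexing approach suggested above boils down to a single table that the crawler fills and the PHP front end queries. A minimal sketch, using Python's `sqlite3` purely as a stand-in for MySQL (the table name, columns, and sample rows are all hypothetical; the SQL is the same shape either way):

```python
import sqlite3

# One table of crawled files; an index on the name column keeps
# searches fast even at 100,000+ rows.
conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE files (
    path TEXT NOT NULL,
    name TEXT NOT NULL,
    size INTEGER
)""")
conn.execute("CREATE INDEX idx_name ON files(name)")

# Rows as a share crawler might produce them (made-up sample data).
rows = [
    ("//server/share/docs", "report.doc", 10240),
    ("//server/share/docs", "budget.xls", 20480),
    ("//server/share/music", "song.mp3", 4096000),
]
conn.executemany("INSERT INTO files VALUES (?, ?, ?)", rows)

# The kind of query a PHP search page would run.
hits = conn.execute(
    "SELECT path, name FROM files WHERE name LIKE ?", ("%.doc",)
).fetchall()
```

Rebuilding the index into a second table and swapping it in afterwards would give the "update in parallel with queries" behavior described earlier in the thread.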
Thanks again for your help
Justin