nopower at suse.de
Fri Feb 17 16:28:29 UTC 2023
On 17/02/2023 10:56, Kees van Vloten via samba wrote:
> Hi Noel,
> As we discussed on the list, I am busy getting the bits and pieces in
> place to be able to test your windows search work.
> I am running all my stuff on Debian Bullseye, with Samba code in 3
> lxc-containers. Two are for the DCs, one is the fileserver. Everything
> is managed by Ansible code, I try to avoid manual changes to my
> environment(s) as much as possible. Generally manual changes are to
> test something and when it works I put it in code.
> The fileserver-container uses a mounted host directory to store the
> file shares.
> In the meantime I have installed Opensearch on the host. Opensearch
> is binary compatible with Elasticsearch 7.x, which is what FSCrawler
> requires, so it should just work.
as mentioned before, I haven't used opensearch so I have no idea what
differences there are with elasticsearch (if any), or how any such
differences might affect the WSP server implementation.
> Now I was looking at FSCrawler and I noticed the last release with
> compiled code is 2.7, is that version alright? I guess we do not want
> to run into issues in FSCrawler while working on Samba, hence 2.10
> snapshots feel like a bad idea.
> /Did you find 2.9 binaries somewhere or how do you deal with this?/
so, the version of fscrawler I last tested with is
> For communication between Samba and Opensearch I will apply the code
> patch from Awen Saunders (Authorization header in smb.conf). Although
> not secure it is good enough for this testing.
sure, I just use anonymous :-) (but you need to do some configuration
steps that I don't remember to get that to work with tls)
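For what it's worth, the anonymous-access configuration alluded to above is roughly the following in elasticsearch.yml (these are the stock `xpack.security.authc.anonymous` settings from the Elasticsearch security docs; which role to grant is an assumption, and in practice you would want a narrower read-only role):

```yaml
# elasticsearch.yml -- sketch only; the role name is a placeholder
xpack.security.authc.anonymous.username: anonymous_user
xpack.security.authc.anonymous.roles: superuser
xpack.security.authc.anonymous.authz_exception: true
```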
> As for Samba I have setup build code to create debian packages from
> https://salsa.debian.org/samba-team/samba. That delivers me .deb
> packages which can be installed with my automation on the fileserver.
hmmm, I am currently trying to get the server to work/build against
master. I'd be willing to backport it to 4.17 or 4.18 but don't see much
value in backporting it any further just for testing
> I guess that is not what we want for this project. My latest idea is
> to deploy a second fileserver container, specifically for this work.
> /How do you build and install samba for dev and test work?/
I just build and run directly from the source tree. If you know the
correct options (e.g. for debian?) to pass to configure, then in the
container you could probably just do 'make install' and it should
overwrite whatever version is there. The distro config options would
probably need to be changed a little so that you build and use the
correct versions of ldb/talloc/tevent etc. instead of the system ones
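As a sketch, an in-container build along those lines could look like this (`--bundled-libraries=ALL` is the samba configure flag that builds the bundled ldb/talloc/tevent instead of linking the system copies; any other flags are whatever your distro config normally passes):

```shell
# inside the fileserver container, from the samba source tree;
# a sketch only -- adjust flags to match your distro's build
./configure --enable-debug --bundled-libraries=ALL
make -j$(nproc)
make install   # overwrites the packaged samba inside the container
```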
> Since FSCrawler will index the shares, I was thinking of installing it
> on the host and not in the fileserver-container. That helps when there
> are 2 fileserver-containers that work on the same underlying host
> storage (one for this work and the regular one). That reduces
> duplication and keeps the containers lean.
> /Would there be any objections with this approach?/
whenever I tested this, it was with a very simple setup, I just run
elasticsearch on my local dev machine, I point fscrawler at a particular
share I use for testing to create the index and that's it. After that I
use my own 'wspsearch' client or connect a windows client and perform
searches from the file explorer/browser
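For reference, a minimal FSCrawler job for that kind of setup looks roughly like this (the job name, share path and URL below are placeholders, not values from this thread; the field layout follows the FSCrawler 2.x `_settings.yaml` format):

```yaml
# ~/.fscrawler/sambashare/_settings.yaml -- illustrative sketch
name: "sambashare"
fs:
  url: "/srv/samba/testshare"
  update_rate: "15m"
elasticsearch:
  nodes:
    - url: "http://127.0.0.1:9200"
```

Run once with `bin/fscrawler sambashare --loop 1` to create and populate the index.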
I think whatever way you want to configure your system is entirely up
to you :-), at the end of the day the wsp server only needs to be able
to talk to the elasticsearch instance to query it. In fact I think I
only ever ran fscrawler once :-)
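To make the "only needs to talk to elasticsearch" point concrete, here is a hedged Python sketch of the kind of full-text query body a search client could POST to the index over the ES REST API. The field names (`content`, `file.filename`, `path.real`) follow FSCrawler's default mapping; the actual queries Samba's WSP server issues may well differ.

```python
import json

def build_search_body(term, size=10):
    """Build a simple Elasticsearch query body matching documents
    whose extracted text contains `term` (FSCrawler stores extracted
    text in the `content` field by default)."""
    return {
        "query": {"match": {"content": term}},
        "_source": ["file.filename", "path.real"],
        "size": size,
    }

body = build_search_body("quarterly report")
print(json.dumps(body))
```

Sending this to `http://localhost:9200/<index>/_search` with any HTTP client is all the "talking" the server has to do.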
> A lot of progress on my side but not yet ready :-)
> - Kees.
I hope to have a working wsp server (well, working for simple searches
at least) early next week