We had a similar requirement on one project. We tackled it on two
fronts. First, we wrote a hacked-up configuration.rb inside Thinking
Sphinx that uses an internal MultiSite object from our code, which
returns a string naming the current site. Here's part of it:
@configuration.searchd.pid_file = "#{self.app_root}/log/searchd.#{environment}.#{MultiSite.site}.pid"
@configuration.searchd.log = "#{self.app_root}/log/searchd.#{MultiSite.site}.log"
@configuration.searchd.query_log = "#{self.app_root}/log/searchd.query.#{MultiSite.site}.log"
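The MultiSite object itself isn't shown above; for the curious, a minimal sketch of that kind of helper might look like the following (the site names, ports, and lookup structure are all made up for illustration, not our actual code):

```ruby
# Hypothetical sketch of a MultiSite helper: tracks which site the
# current request belongs to and exposes per-site settings.
module MultiSite
  # Per-site properties, e.g. which port each site's searchd listens on.
  # These values are placeholders for illustration.
  SITES = {
    "alpha" => { :searchd_port => 9312 },
    "beta"  => { :searchd_port => 9313 }
  }

  class << self
    # Set per-request (e.g. from the hostname of the incoming request).
    attr_writer :site

    # Name of the current site; defaults to the first one here.
    def site
      @site ||= "alpha"
    end

    # Look up a property for the current site.
    def prop(key)
      SITES[site][key]
    end
  end
end
```

With something like that in place, MultiSite.site drops straight into the interpolated config paths above.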
You get the idea: all the configuration is namespaced by site name.
Then we run one searchd daemon per site, so we actually have
completely separate indexes for each of the sites (all of which have
exactly the same database schema, but different database instances, an
architecture necessitated by some fairly strict business rules about
what we can physically share between customers). Then, when a request
comes in that needs Sphinx for searching, we play games in the
controller code to hook up to the right index:
ThinkingSphinx::Configuration.instance.port = MultiSite.prop(:searchd_port)
That points TS at the searchd instance whose port is keyed off the
MultiSite object, and then it's off to the races making .search calls
to find data.
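For context, here's a runnable sketch of what that per-request switch amounts to. ThinkingSphinx::Configuration is stubbed out here so the sketch stands alone, and the port table stands in for MultiSite.prop(:searchd_port); the site names and ports are invented:

```ruby
# Stub stand-in for Thinking Sphinx's configuration singleton, so this
# sketch runs without the real library. The real class exposes more,
# but a settable port is all we need to show the idea.
module ThinkingSphinx
  class Configuration
    attr_accessor :port

    def self.instance
      @instance ||= new
    end
  end
end

# Assumed per-site port table (stands in for MultiSite.prop lookups).
SEARCHD_PORTS = { "alpha" => 9312, "beta" => 9313 }

# What the before-action hook in the controller boils down to: point TS
# at the searchd daemon serving the current site's index.
def point_sphinx_at(site)
  ThinkingSphinx::Configuration.instance.port = SEARCHD_PORTS[site]
end

point_sphinx_at("beta")
ThinkingSphinx::Configuration.instance.port   # => 9313
```

In the real app the equivalent of point_sphinx_at runs as a controller filter before any search action, so every request lands on its own site's daemon.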
Hope that helps, or at least gives you an architecture to start from.
Mike