No IndexId active

Pete

Nov 3, 2009, 12:07:25 PM
to hounder
I installed Hounder on two separate Linux boxes, one Fedora and the other SUSE. I get the same issue each time.

1. Inside the hounder home, it says that the searcher is in the BAD
state. However, the terminal shows all the services as running.

2. When I use the web interface to search, anything I enter returns
"No indexId active".

Please help; I am desperate to get Hounder running.

Thanks

Jorge Handl

Nov 3, 2009, 12:18:50 PM
to hou...@googlegroups.com
Pete, that message indicates that the searcher has no index to search on. In order to help, we need to know more about your setup:

- Which components did you install (crawler, indexer, searcher)?
- Where did you install them?
- How are they configured?
- Did you try the 5 minute tutorial?

Pete Sanders

Nov 3, 2009, 12:43:37 PM
to hou...@googlegroups.com
Jorge,

I followed the 5 minute tutorial exactly and installed all the components. The only thing I could have done wrong is my external host name, but does that matter?

I installed them under my user home directory in a folder called hounder, and I also tried the default location. I am running the newest Fedora.

Misc info:
- All Hounder services say they are running; only the searcher says BAD.
- When I install, it says the log file already exists before it loads the GUI installer.

Any ideas?

Jorge Handl

Nov 3, 2009, 1:00:42 PM
to hou...@googlegroups.com
Pete,

I'd trace the problem back from the searcher to the crawler:

1. Verify that the searcher has no index by listing the contents of searcher/indexes. From the error message, I presume it is empty.

2. Check if the indexer produced an index by listing the contents of indexer/indexes/index. If it exists and contains files, there is a problem in the communication between the indexer and the searcher. In that case, the first thing to try is "ssh localhost", which should not ask for a password.

3. If the indexer had no index, check that the crawler did crawl something by running the following command:

      java -cp crawler/conf:lib/hounder-trunk.jar:lib/hounder-trunk-deps.jar com.flaptor.hounder.crawler.pagedb.PageDB stats crawler/pagedb

The line that starts with "stat" will tell you how many pages were fetched and how many failed (all three checks are sketched together below).
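
Putting the three checks together, here is a minimal shell sketch. It assumes you run it from the Hounder install directory and that the paths match the default layout from the tutorial:

      # 1. Searcher index: if this directory is empty, the searcher has nothing to serve.
      ls -l searcher/indexes

      # 2. Indexer output: if this exists and contains files, suspect the
      #    indexer-to-searcher copy. As a first test, this should log you in
      #    without asking for a password:
      ls -l indexer/indexes/index
      ssh localhost

      # 3. Crawler stats: the "stat" line shows how many pages were fetched and how many failed.
      java -cp crawler/conf:lib/hounder-trunk.jar:lib/hounder-trunk-deps.jar \
          com.flaptor.hounder.crawler.pagedb.PageDB stats crawler/pagedb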

Please let me know what you find out.

- Jorge

Pete Sanders

Nov 3, 2009, 1:03:25 PM
to hou...@googlegroups.com
Jorge,

When I do "ssh localhost" from the terminal, I get "connection refused". The firewall is off; do I need to install something else?

Thanks

Jorge Handl

Nov 3, 2009, 1:07:41 PM
to hou...@googlegroups.com
Do you have a pair of keys installed in your ~/.ssh directory? If not, follow the link suggested in the 5 minute tutorial (http://www.ece.uci.edu/~chou/ssh-key.html).
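As a minimal sketch (assuming a Fedora of that era with SysV-style service commands; note that "connection refused" usually means the ssh daemon itself is not running, which keys alone won't fix):

      # Make sure the ssh daemon is installed and running (run as root):
      yum install openssh-server
      service sshd start

      # Generate a key pair (accept the defaults):
      ssh-keygen -t rsa

      # Authorize your own public key for passwordless login to localhost:
      cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
      chmod 600 ~/.ssh/authorized_keys

      # This should now log you in without prompting:
      ssh localhost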
- Jorge