mvn clean install -DskipTests -Dhbase.api=0.98
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:2.0.2:compile (default-compile) on project hbase-indexer-engine: Compilation failure: Compilation failure:
[ERROR] /home/sofia/hbase-indexer/hbase-indexer-engine/src/main/java/com/ngdata/hbaseindexer/indexer/SolrServerFactory.java:[43,15] error: incompatible types
[ERROR]
[ERROR] could not parse error message: required: SolrServer
[ERROR] found: CloudSolrServer
[ERROR] /home/sofia/hbase-indexer/hbase-indexer-engine/src/main/java/com/ngdata/hbaseindexer/indexer/SolrServerFactory.java:49: error: no suitable method found for add(HttpSolrServer)
[ERROR] result.add(new HttpSolrServer(shard, httpClient));
[ERROR] ^
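The "incompatible types: required SolrServer, found CloudSolrServer" failure is the classic symptom of compiling 4.x-era code against solrj 5.x: in 5.x, SolrClient replaced SolrServer as the root client type, and CloudSolrServer (now deprecated) descends from SolrClient rather than SolrServer. A minimal stand-in sketch of why the old assignment no longer compiles (hypothetical classes mimicking the hierarchy, NOT real solrj code):

```java
// Stand-in classes mimicking the solrj 5.x hierarchy (NOT real solrj code):
class SolrClient {}                          // new root client type in solrj 5.x
class SolrServer extends SolrClient {}       // deprecated 4.x root, kept as a bridge
class CloudSolrServer extends SolrClient {}  // 5.x: descends from SolrClient, not SolrServer

public class Demo {
    public static void main(String[] args) {
        // SolrServer s = new CloudSolrServer();  // 4.x-style code: no longer compiles
        SolrClient c = new CloudSolrServer();     // widening to SolrClient does compile
        System.out.println(c.getClass().getSimpleName());
    }
}
```

So code written against the 4.x SolrServer hierarchy has to be ported to the SolrClient types before it will build against a 5.x solrj jar.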
Gabriel,
Will hbase-indexer be updated anytime soon to account for recent Solr API changes?
got event WatchedEvent state:SyncConnected type:None path:null
2016-03-16 10:31:59,030 INFO org.apache.solr.common.cloud.ConnectionManager: Client is connected to ZooKeeper
2016-03-16 10:31:59,030 INFO org.apache.solr.common.cloud.SolrZkClient: Using default ZkACLProvider
2016-03-16 10:31:59,032 INFO org.apache.solr.common.cloud.ZkStateReader: Updating cluster state from ZooKeeper...
2016-03-16 10:31:59,040 ERROR com.ngdata.hbaseindexer.indexer.DirectSolrInputDocumentWriter: Error updating Solr
org.apache.solr.common.SolrException: Could not find collection : hpfcollection
at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:162)
at org.apache.solr.client.solrj.impl.CloudSolrServer.directUpdate(CloudSolrServer.java:305)
at org.apache.solr.client.solrj.impl.CloudSolrServer.request(CloudSolrServer.java:539)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:116)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:102)
at com.ngdata.hbaseindexer.indexer.DirectSolrInputDocumentWriter.retryAddsIndividually(DirectSolrInputDocumentWriter.java:123)
at com.ngdata.hbaseindexer.indexer.DirectSolrInputDocumentWriter.add(DirectSolrInputDocumentWriter.java:108)
at com.ngdata.hbaseindexer.indexer.Indexer.indexRowData(Indexer.java:156)
at com.ngdata.hbaseindexer.indexer.IndexingEventListener.processEvents(IndexingEventListener.java:99)
at com.ngdata.sep.impl.SepEventExecutor$1.run(SepEventExecutor.java:97)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
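"Could not find collection" here means the collection name was not present in the cluster state the client read from ZooKeeper, which is exactly what happens when the client is connected under the wrong chroot and sees an empty state. A rough stand-in sketch of the lookup behaviour (hypothetical getCollection helper over a plain map, not the real ClusterState code):

```java
import java.util.Collections;
import java.util.Map;

// Rough stand-in for what ClusterState.getCollection effectively does:
// look the collection name up in the state read from ZooKeeper and throw
// if it is absent. A client connected under the wrong chroot simply sees
// an empty cluster state, so every lookup fails this way.
public class ClusterStateDemo {
    static String getCollection(Map<String, String> clusterState, String name) {
        String coll = clusterState.get(name);
        if (coll == null) {
            throw new RuntimeException("Could not find collection : " + name);
        }
        return coll;
    }

    public static void main(String[] args) {
        Map<String, String> emptyState = Collections.emptyMap(); // nothing visible under this chroot
        try {
            getCollection(emptyState, "hpfcollection");
        } catch (RuntimeException e) {
            System.out.println(e.getMessage());
        }
    }
}
```

In other words, the exception does not necessarily mean the collection is missing from Solr; it can equally mean the indexer is looking at the wrong place in ZooKeeper.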


It almost seems like an issue with the Solrj library; any thoughts on that? I am suspicious whether Solrj 4.4 dealt with SolrCloud in the same way as Solrj 5.x does. I tried this method from the indexer code locally with a few modifications, and it works fine with Solr 5.2.0: I was able to successfully add documents to the Solr server.

    private static void createSolrServers() throws SolrServerException, IOException {
        String solrMode = "cloud";
        if (solrMode.equals("cloud")) {
            String indexZkHost = "node1:2181,node2:2181,node3:2181/solr";
            String collectionName = "hpfcollection";
            CloudSolrServer solrServer = new CloudSolrServer(indexZkHost);
            int zkSessionTimeout = 3000;
            solrServer.setZkClientTimeout(zkSessionTimeout);
            solrServer.setZkConnectTimeout(zkSessionTimeout);
            solrServer.setDefaultCollection(collectionName);
            SolrInputDocument doc = new SolrInputDocument();
            doc.addField("id", "12");
            doc.addField("content", "this is added programatically 1233");
            solrServer.add(doc, 10); // commitWithin of 10 ms
            Set<CloudSolrServer> servers = Collections.singleton(solrServer);
        } else if (solrMode.equals("classic")) {
            /*
            PoolingClientConnectionManager connectionManager = new PoolingClientConnectionManager();
            connectionManager.setDefaultMaxPerRoute(getSolrMaxConnectionsPerRoute(indexConnectionParams));
            connectionManager.setMaxTotal(getSolrMaxConnectionsTotal(indexConnectionParams));
            HttpClient httpClient = new DefaultHttpClient(connectionManager);
            return new HashSet<SolrServer>(createHttpSolrServers(indexConnectionParams, httpClient));
            */
        } else {
            throw new RuntimeException("Only 'cloud' and 'classic' are valid values for solr.mode, but got " + solrMode);
        }
    }
On Fri, Apr 8, 2016 at 8:58 AM, Gabriel Reid <gabrie...@gmail.com> wrote:

Hi Ravi,

No, there wouldn't be anything related to hbase-indexer in the solr cloud dump.

However, the chroot (i.e. the "/solr" suffix) on your ZooKeeper connection parameter is a likely cause of the problem here. The ZK nodes should indeed include the /solr suffix (and optionally the port number), as shown here: https://github.com/NGDATA/hbase-indexer/wiki/CLI-tools#add-indexer

- Gabriel
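For what it's worth, the way ZooKeeper clients interpret a connect string with a chroot can be sketched in plain Java: everything from the first '/' onwards is the chroot path, and it applies to the whole ensemble, not to each host individually. This is only an illustrative helper (the splitChroot name is made up; it mirrors, but is not, ZooKeeper's actual parser):

```java
// Illustrative sketch of how a ZooKeeper connect string with a chroot is
// interpreted: the part before the first '/' is the host list (the whole
// ensemble), the part from the first '/' onwards is the chroot path.
public class ZkChrootDemo {
    static String[] splitChroot(String connectString) {
        int idx = connectString.indexOf('/');
        if (idx < 0) {
            return new String[] { connectString, "" }; // no chroot given
        }
        return new String[] { connectString.substring(0, idx), connectString.substring(idx) };
    }

    public static void main(String[] args) {
        String zk = "node1:2181,node2:2181,node3:2181/solr";
        String[] parts = splitChroot(zk);
        System.out.println("ensemble: " + parts[0]);
        System.out.println("chroot:   " + parts[1]);
    }
}
```

So "node1:2181,node2:2181,node3:2181/solr" means "the ensemble node1/node2/node3, rooted at /solr"; the suffix is written once, after the last host:port entry.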
Hi Gabriel,

Thank you for your help. I tried all combinations of the zkHost setting, with/without ports and with/without /solr, without any luck :( I also compiled the source code (https://github.com/NGDATA/hbase-indexer) and deployed the jars, but I ran into a different set of issues.

Hope I am not missing something obvious; what else do you think the issue could be?

2016-04-11 23:18:06,382 INFO org.kitesdk.morphline.api.MorphlineContext: Importing commands
2016-04-11 23:18:07,081 INFO org.kitesdk.morphline.api.MorphlineContext: Done importing commands
2016-04-11 23:18:07,084 ERROR com.ngdata.hbaseindexer.indexer.DirectSolrInputDocumentWriter: Error updating Solr
org.apache.solr.common.SolrException: Could not find collection : hpfcollection
at org.apache.solr.common.cloud.ClusterState.getCollection(ClusterState.java:162)
at org.apache.solr.client.solrj.impl.CloudSolrServer.directUpdate(CloudSolrServer.java:305)
at org.apache.solr.client.solrj.impl.CloudSolrServer.request(CloudSolrServer.java:539)
at org.apache.solr.client.solrj.request.AbstractUpdateRequest.process(AbstractUpdateRequest.java:124)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:116)
at org.apache.solr.client.solrj.SolrServer.add(SolrServer.java:102)
at com.ngdata.hbaseindexer.indexer.DirectSolrInputDocumentWriter.retryAddsIndividually(DirectSolrInputDocumentWriter.java:123)
at com.ngdata.hbaseindexer.indexer.DirectSolrInputDocumentWriter.add(DirectSolrInputDocumentWriter.java:108)
at com.ngdata.hbaseindexer.indexer.Indexer.indexRowData(Indexer.java:156)
at com.ngdata.hbaseindexer.indexer.IndexingEventListener.processEvents(IndexingEventListener.java:99)
at com.ngdata.sep.impl.SepEventExecutor$1.run(SepEventExecutor.java:97)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
at java.util.concurrent.FutureTask.run(FutureTask.java:262)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
at java.lang.Thread.run(Thread.java:745)
Ravi