Hi,
I'm currently trying to ingest metacards using DDF 2.8.1 and I'm getting the following error:
2016-01-05 12:48:00,023 | ERROR | qtp567592787-199 | SolrCore | apache.solr.common.SolrException 131 | 225 - platform-solr-server-standalone - 2.8.1 | org.apache.solr.common.SolrException: Exception writing document id 037ca53a-e8e6-4e67-910a-fe87e232292c to the index; possible analysis error.
at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:168)
at org.apache.solr.update.processor.RunUpdateProcessor.processAdd(RunUpdateProcessorFactory.java:69)
at org.apache.solr.update.processor.UpdateRequestProcessor.processAdd(UpdateRequestProcessor.java:51)
...
Caused by: java.lang.IllegalArgumentException: Document contains at least one immense term in field="metadata_txt" (whose UTF8 encoding is longer than the max length 32766), all of which were skipped. Please correct the analyzer to not produce such terms. The prefix of the first immense term is: '[60, 117, 114, 110, 58, 82, 101, 115, 117, 108, 116, 115, 32, 115, 99, 104, 101, 109, 97, 86, 101, 114, 115, 105, 111, 110, 61, 34, 48, 46]...', original message: bytes can be at most 32766 in length; got 44566
at org.apache.lucene.index.DefaultIndexingChain$PerField.invert(DefaultIndexingChain.java:687)
at org.apache.lucene.index.DefaultIndexingChain.processField(DefaultIndexingChain.java:359)
at org.apache.lucene.index.DefaultIndexingChain.processDocument(DefaultIndexingChain.java:318)
at org.apache.lucene.index.DocumentsWriterPerThread.updateDocument(DocumentsWriterPerThread.java:241)
at org.apache.lucene.index.DocumentsWriter.updateDocument(DocumentsWriter.java:465)
at org.apache.lucene.index.IndexWriter.updateDocument(IndexWriter.java:1526)
at org.apache.solr.update.DirectUpdateHandler2.addDoc0(DirectUpdateHandler2.java:240)
at org.apache.solr.update.DirectUpdateHandler2.addDoc(DirectUpdateHandler2.java:164)
... 62 more
Caused by: org.apache.lucene.util.BytesRefHash$MaxBytesLengthExceededException: bytes can be at most 32766 in length; got 44566
at org.apache.lucene.util.BytesRefHash.add(BytesRefHash.java:284)
at org.apache.lucene.index.TermsHashPerField.add(TermsHashPerField.java:151)
at org.apache.lucene.index.DefaultIndexingChain$PerField.invert(DefaultIndexingChain.java:663)
... 69 more
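For what it's worth, decoding the byte values that Lucene prints for the "immense term" prefix shows the offending term is the raw metadata XML itself, indexed as a single un-tokenized value (a quick check, using the byte list copied from the error above):

```python
# Byte values copied from the "first immense term" prefix in the error message.
prefix_bytes = [60, 117, 114, 110, 58, 82, 101, 115, 117, 108, 116, 115,
                32, 115, 99, 104, 101, 109, 97, 86, 101, 114, 115, 105,
                111, 110, 61, 34, 48, 46]

# Lucene reports the term's UTF-8 encoding, so decode it as UTF-8.
decoded = bytes(prefix_bytes).decode("utf-8")
print(decoded)  # <urn:Results schemaVersion="0.
```

So the whole 44566-byte metadata document seems to be going into metadata_txt as one term, which exceeds Lucene's 32766-byte per-term limit.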
This error does not occur with DDF 2.6.1.
I have tried several workarounds found on the web, but none of them solves the issue.
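For example, one commonly suggested fix is to cap token length in the Solr analyzer chain for the affected field type, along these lines (a sketch of the kind of schema.xml change suggested online; the field type name here is illustrative, not necessarily what DDF's schema actually uses):

```xml
<!-- schema.xml: tokenize the metadata and drop any token that would
     exceed Lucene's 32766-byte per-term limit -->
<fieldType name="text_metadata" class="solr.TextField">
  <analyzer>
    <tokenizer class="solr.StandardTokenizerFactory"/>
    <!-- LengthFilterFactory removes tokens outside the [min, max] length range -->
    <filter class="solr.LengthFilterFactory" min="1" max="32766"/>
  </analyzer>
</fieldType>
```

This did not help in my case, presumably because the DDF-managed schema indexes metadata_txt differently.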
Do you have anything specific to suggest that would solve this?
Thanks in advance
Samuel