Gerrit 3.12.0 Reindex after Update from 3.11.3 failed


Dennis W

May 20, 2025, 11:06:24 AM
to Repo and Gerrit Discussion
Hello everyone,
I updated our test server from 3.11.3 to 3.12.0 as an offline upgrade [1]
and was met with a lot of exceptions during reindexing:

Reindexing changes: Slicing projects: 100% (384/384) (-)[2025-05-20 09:48:35,646] [Index-Batch-3[Index all changes of project 0000_meta-dev][0000_meta-dev-0]] WARN  com.google.gerrit.server.cache.h2.H2CacheImpl : Cannot read cache jdbc:h2:file:///usr/local/gerrit/cache/gerrit_file_diff-v2 for FileDiffCacheKey{project=0000_meta-dev, oldCommit=commit c7a3960bb1aa8bb7ccadc5739a02ff95d557a25c 0 -------, newCommit=AnyObjectId[48b598d9d6015adcf2aef12b3f09a6f1c029d6d9], newFilePath=/COMMIT_MSG, renameScore=60, diffAlgorithm=HISTOGRAM_WITH_FALLBACK_MYERS, whitespace=IGNORE_NONE, useTimeout=true}
org.h2.jdbc.JdbcBatchUpdateException: Allgemeiner Fehler: "java.lang.RuntimeException: object already exists: VERSION_KEY"
General error: "java.lang.RuntimeException: object already exists: VERSION_KEY"; SQL statement:
CREATE INDEX IF NOT EXISTS version_key ON data(version, k) [50000-232]
        at org.h2.jdbc.JdbcStatement.executeBatch(JdbcStatement.java:828)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlHandle.<init>(H2CacheImpl.java:749)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlStore.acquire(H2CacheImpl.java:704)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlStore.getIfPresent(H2CacheImpl.java:456)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$Loader.loadAll(H2CacheImpl.java:276)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache$BulkLoader.loadAll(CaffeinatedGuavaLoadingCache.java:179)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newBulkMappingFunction$3(LocalLoadingCache.java:162)
        at com.github.benmanes.caffeine.cache.LocalManualCache.bulkLoad(LocalManualCache.java:102)
        at com.github.benmanes.caffeine.cache.LocalManualCache.getAll(LocalManualCache.java:89)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.getAll(LocalLoadingCache.java:62)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache.getAll(CaffeinatedGuavaLoadingCache.java:91)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.getAll(H2CacheImpl.java:144)
        at com.google.gerrit.server.patch.filediff.FileDiffCacheImpl.getAll(FileDiffCacheImpl.java:131)
        at com.google.gerrit.server.patch.DiffOperationsImpl.getModifiedFilesForKeys(DiffOperationsImpl.java:319)
        at com.google.gerrit.server.patch.DiffOperationsImpl.getModifiedFiles(DiffOperationsImpl.java:297)
        at com.google.gerrit.server.patch.DiffOperationsImpl.listModifiedFilesAgainstParent(DiffOperationsImpl.java:130)
        at com.google.gerrit.server.patch.DiffSummaryLoader.call(DiffSummaryLoader.java:51)
        at com.google.gerrit.server.patch.DiffSummaryLoader.call(DiffSummaryLoader.java:28)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.lambda$get$1(H2CacheImpl.java:163)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaCache.lambda$get$0(CaffeinatedGuavaCache.java:69)
        at com.github.benmanes.caffeine.cache.LocalCache.lambda$statsAware$0(LocalCache.java:139)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2413)
        at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2411)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2394)
        at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
        at com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaCache.get(CaffeinatedGuavaCache.java:67)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.get(H2CacheImpl.java:158)
        at com.google.gerrit.server.patch.PatchListCacheImpl.getDiffSummary(PatchListCacheImpl.java:101)
        at com.google.gerrit.server.query.change.ChangeData.getDiffSummary(ChangeData.java:601)
        at com.google.gerrit.server.query.change.ChangeData.computeChangedLines(ChangeData.java:610)
        at com.google.gerrit.server.query.change.ChangeData.changedLines(ChangeData.java:622)
        at com.google.gerrit.server.index.change.ChangeField.lambda$static$56(ChangeField.java:1207)
        at com.google.gerrit.index.IndexedField.get(IndexedField.java:437)
        at com.google.gerrit.index.IndexedField$SearchSpec.get(IndexedField.java:151)
        at com.google.gerrit.index.Schema.fieldValues(Schema.java:273)
        at com.google.gerrit.index.Schema.lambda$buildFields$0(Schema.java:306)
        at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
        at com.google.common.collect.CollectSpliterators$1WithCharacteristics.lambda$forEachRemaining$1(CollectSpliterators.java:72)
        at java.base/java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:104)
        at com.google.common.collect.CollectSpliterators$1WithCharacteristics.forEachRemaining(CollectSpliterators.java:72)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
        at com.google.gerrit.index.Schema.buildFields(Schema.java:308)
        at com.google.gerrit.lucene.AbstractLuceneIndex.toDocument(AbstractLuceneIndex.java:359)
        at com.google.gerrit.lucene.LuceneChangeIndex.insert(LuceneChangeIndex.java:235)
        at com.google.gerrit.lucene.LuceneChangeIndex.insert(LuceneChangeIndex.java:105)
        at com.google.gerrit.server.index.change.ChangeIndexer.indexImpl(ChangeIndexer.java:337)
        at com.google.gerrit.server.index.change.ChangeIndexer.doIndex(ChangeIndexer.java:297)
        at com.google.gerrit.server.index.change.ChangeIndexer.index(ChangeIndexer.java:293)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.lambda$new$1(AllChangesIndexer.java:286)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.index(AllChangesIndexer.java:327)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.lambda$call$3(AllChangesIndexer.java:313)
        at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
        at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:179)
        at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
        at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:179)
        at java.base/java.util.Spliterators$ArraySpliterator.forEachRemaining(Spliterators.java:1024)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
        at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
        at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.call(AllChangesIndexer.java:313)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.call(AllChangesIndexer.java:266)
        at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
        at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:75)
        at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
        at com.google.gerrit.server.logging.LoggingContextAwareRunnable.run(LoggingContextAwareRunnable.java:95)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
        at com.google.gerrit.server.git.WorkQueue$Task.run(WorkQueue.java:912)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        at java.base/java.lang.Thread.run(Thread.java:1583)

org.h2.jdbc.JdbcSQLTimeoutException: Zeitüberschreitung beim Versuch die Tabelle null zu sperren
Timeout trying to lock table null; SQL statement:
CREATE INDEX IF NOT EXISTS accessed ON data(accessed) [50200-232]
[2025-05-20 09:48:35,669] [Index-Batch-2[Index all changes of project 0000_meta][0000_meta-0]] WARN  com.google.gerrit.server.cache.h2.H2CacheImpl : Cannot read cache jdbc:h2:file:///usr/local/gerrit/cache/git_file_diff-v2 for GitFileDiffCacheKey{project=0000_meta, oldTree=tree 2faf1180a9155e48e60ed225720a7408c49ffc8b -------, newTree=tree 1ff82996446c6bc64673d273238db4f53f2d7aec -------, newFilePath=project.config, renameScore=60, diffAlgorithm=HISTOGRAM_WITH_FALLBACK_MYERS, whitespace=IGNORE_NONE, useTimeout=true}
org.h2.jdbc.JdbcBatchUpdateException: Tabelle "DATA" nicht gefunden (mögliche Kandidaten: "DATA")
Table "DATA" not found (candidates are: "DATA"); SQL statement:
INSERT INTO "PUBLIC"."DATA_COPY_4_1"("K", "V", "CREATED", "ACCESSED") OVERRIDING SYSTEM VALUE SELECT "K", "V", "CREATED", "ACCESSED" FROM "PUBLIC"."DATA" [42103-232]
        at org.h2.jdbc.JdbcStatement.executeBatch(JdbcStatement.java:828)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlHandle.<init>(H2CacheImpl.java:749)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlStore.acquire(H2CacheImpl.java:704)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlStore.getIfPresent(H2CacheImpl.java:456)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$Loader.loadAll(H2CacheImpl.java:276)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache$BulkLoader.loadAll(CaffeinatedGuavaLoadingCache.java:179)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newBulkMappingFunction$3(LocalLoadingCache.java:162)
        at com.github.benmanes.caffeine.cache.LocalManualCache.bulkLoad(LocalManualCache.java:102)
        at com.github.benmanes.caffeine.cache.LocalManualCache.getAll(LocalManualCache.java:89)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.getAll(LocalLoadingCache.java:62)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache.getAll(CaffeinatedGuavaLoadingCache.java:91)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.getAll(H2CacheImpl.java:144)
        at com.google.gerrit.server.patch.gitfilediff.GitFileDiffCacheImpl.getAll(GitFileDiffCacheImpl.java:153)
        at com.google.gerrit.server.patch.filediff.AllDiffsEvaluator.computeGitFileDiffs(AllDiffsEvaluator.java:156)
        at com.google.gerrit.server.patch.filediff.AllDiffsEvaluator.execute(AllDiffsEvaluator.java:79)
        at com.google.gerrit.server.patch.filediff.FileDiffCacheImpl$FileDiffLoader.createFileEntries(FileDiffCacheImpl.java:387)
        at com.google.gerrit.server.patch.filediff.FileDiffCacheImpl$FileDiffLoader.loadAll(FileDiffCacheImpl.java:193)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$Loader.loadAll(H2CacheImpl.java:284)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache$BulkLoader.loadAll(CaffeinatedGuavaLoadingCache.java:179)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newBulkMappingFunction$3(LocalLoadingCache.java:162)
        at com.github.benmanes.caffeine.cache.LocalManualCache.bulkLoad(LocalManualCache.java:102)
        at com.github.benmanes.caffeine.cache.LocalManualCache.getAll(LocalManualCache.java:89)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.getAll(LocalLoadingCache.java:62)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache.getAll(CaffeinatedGuavaLoadingCache.java:91)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.getAll(H2CacheImpl.java:144)
        at com.google.gerrit.server.patch.filediff.FileDiffCacheImpl.getAll(FileDiffCacheImpl.java:131)
        at com.google.gerrit.server.patch.DiffOperationsImpl.getModifiedFilesForKeys(DiffOperationsImpl.java:319)
        at com.google.gerrit.server.patch.DiffOperationsImpl.getModifiedFiles(DiffOperationsImpl.java:297)
        at com.google.gerrit.server.patch.DiffOperationsImpl.listModifiedFilesAgainstParent(DiffOperationsImpl.java:130)
        at com.google.gerrit.server.patch.DiffSummaryLoader.call(DiffSummaryLoader.java:51)
        at com.google.gerrit.server.patch.DiffSummaryLoader.call(DiffSummaryLoader.java:28)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.lambda$get$1(H2CacheImpl.java:163)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaCache.lambda$get$0(CaffeinatedGuavaCache.java:69)
        at com.github.benmanes.caffeine.cache.LocalCache.lambda$statsAware$0(LocalCache.java:139)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2413)
        at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2411)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2394)
        at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
        at com.github.benmanes.caffeine.cache.LocalManualCache.get(LocalManualCache.java:62)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaCache.get(CaffeinatedGuavaCache.java:67)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.get(H2CacheImpl.java:158)
        at com.google.gerrit.server.patch.PatchListCacheImpl.getDiffSummary(PatchListCacheImpl.java:101)
        at com.google.gerrit.server.query.change.ChangeData.getDiffSummary(ChangeData.java:601)
        at com.google.gerrit.server.query.change.ChangeData.computeChangedLines(ChangeData.java:610)
        at com.google.gerrit.server.query.change.ChangeData.changedLines(ChangeData.java:622)
        at com.google.gerrit.server.index.change.ChangeField.lambda$static$56(ChangeField.java:1207)
        at com.google.gerrit.index.IndexedField.get(IndexedField.java:437)
        at com.google.gerrit.index.IndexedField$SearchSpec.get(IndexedField.java:151)
        at com.google.gerrit.index.Schema.fieldValues(Schema.java:273)
        at com.google.gerrit.index.Schema.lambda$buildFields$0(Schema.java:306)
        at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
        at com.google.common.collect.CollectSpliterators$1WithCharacteristics.lambda$forEachRemaining$1(CollectSpliterators.java:72)
        at java.base/java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:104)
        at com.google.common.collect.CollectSpliterators$1WithCharacteristics.forEachRemaining(CollectSpliterators.java:72)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
        at java.base/java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:921)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:682)
        at com.google.gerrit.index.Schema.buildFields(Schema.java:308)
        at com.google.gerrit.lucene.AbstractLuceneIndex.toDocument(AbstractLuceneIndex.java:359)
        at com.google.gerrit.lucene.LuceneChangeIndex.insert(LuceneChangeIndex.java:235)
        at com.google.gerrit.lucene.LuceneChangeIndex.insert(LuceneChangeIndex.java:105)
        at com.google.gerrit.server.index.change.ChangeIndexer.indexImpl(ChangeIndexer.java:337)
        at com.google.gerrit.server.index.change.ChangeIndexer.doIndex(ChangeIndexer.java:297)
        at com.google.gerrit.server.index.change.ChangeIndexer.index(ChangeIndexer.java:293)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.lambda$new$1(AllChangesIndexer.java:286)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.index(AllChangesIndexer.java:327)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.lambda$call$3(AllChangesIndexer.java:313)
        at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.accept(ForEachOps.java:184)
        at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:179)
        at java.base/java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:197)
        at java.base/java.util.stream.ReferencePipeline$2$1.accept(ReferencePipeline.java:179)
        at java.base/java.util.Iterator.forEachRemaining(Iterator.java:133)
        at java.base/java.util.Spliterators$IteratorSpliterator.forEachRemaining(Spliterators.java:1939)
        at java.base/java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:509)
        at java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:499)
        at java.base/java.util.stream.ForEachOps$ForEachOp.evaluateSequential(ForEachOps.java:151)
        at java.base/java.util.stream.ForEachOps$ForEachOp$OfRef.evaluateSequential(ForEachOps.java:174)
        at java.base/java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
        at java.base/java.util.stream.ReferencePipeline.forEach(ReferencePipeline.java:596)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.call(AllChangesIndexer.java:313)
        at com.google.gerrit.server.index.change.AllChangesIndexer$ProjectSliceIndexer.call(AllChangesIndexer.java:266)
        at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
        at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:75)
        at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
        at com.google.gerrit.server.logging.LoggingContextAwareRunnable.run(LoggingContextAwareRunnable.java:95)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
        at com.google.gerrit.server.git.WorkQueue$Task.run(WorkQueue.java:912)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        at java.base/java.lang.Thread.run(Thread.java:1583)
org.h2.jdbc.JdbcSQLSyntaxErrorException: Tabelle "DATA" nicht gefunden (mögliche Kandidaten: "DATA")
 

On the second reindexing attempt the exception changes: now it repeatedly complains that the table "DATA" already exists:

org.h2.jdbc.JdbcSQLSyntaxErrorException: Tabelle "DATA" besteht bereits
Table "DATA" already exists; SQL statement:
CREATE CACHED TABLE "PUBLIC"."DATA"(
    "K" JAVA_OBJECT NOT NULL,
    "V" JAVA_OBJECT NOT NULL,
    "CREATED" TIMESTAMP NOT NULL,
    "ACCESSED" TIMESTAMP NOT NULL,
    "SPACE" BIGINT GENERATED ALWAYS AS (OCTET_LENGTH("K") + OCTET_LENGTH("V")),
    "VERSION" INTEGER DEFAULT 0 NOT NULL
) [42101-232]
        at org.h2.message.DbException.getJdbcSQLException(DbException.java:514)
        at org.h2.message.DbException.getJdbcSQLException(DbException.java:489)
        at org.h2.message.DbException.get(DbException.java:223)
        at org.h2.message.DbException.get(DbException.java:199)
        at org.h2.command.ddl.CreateTable.update(CreateTable.java:91)
        at org.h2.engine.MetaRecord.prepareAndExecute(MetaRecord.java:77)
        at org.h2.engine.Database.executeMeta(Database.java:665)
        at org.h2.engine.Database.executeMeta(Database.java:637)
        at org.h2.engine.Database.<init>(Database.java:359)
        at org.h2.engine.Engine.openSession(Engine.java:92)
        at org.h2.engine.Engine.openSession(Engine.java:222)
        at org.h2.engine.Engine.createSession(Engine.java:201)
        at org.h2.engine.SessionRemote.connectEmbeddedOrServer(SessionRemote.java:344)
        at org.h2.jdbc.JdbcConnection.<init>(JdbcConnection.java:124)
        at org.h2.Driver.connect(Driver.java:59)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlHandle.<init>(H2CacheImpl.java:732)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlStore.acquire(H2CacheImpl.java:704)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$SqlStore.getIfPresent(H2CacheImpl.java:456)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$Loader.load(H2CacheImpl.java:260)
        at com.google.gerrit.server.cache.h2.H2CacheImpl$Loader.load(H2CacheImpl.java:244)
        at com.google.gerrit.server.cache.mem.PassthroughLoadingCache.get(PassthroughLoadingCache.java:88)
        at com.google.gerrit.server.cache.h2.H2CacheImpl.get(H2CacheImpl.java:130)
        at com.google.gerrit.server.project.ProjectCacheImpl$InMemoryLoader.load(ProjectCacheImpl.java:420)
        at com.google.gerrit.server.project.ProjectCacheImpl$InMemoryLoader.load(ProjectCacheImpl.java:361)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache$SingleLoader.load(CaffeinatedGuavaLoadingCache.java:136)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.lambda$newMappingFunction$2(LocalLoadingCache.java:141)
        at com.github.benmanes.caffeine.cache.LocalCache.lambda$statsAware$0(LocalCache.java:139)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.lambda$doComputeIfAbsent$14(BoundedLocalCache.java:2413)
        at java.base/java.util.concurrent.ConcurrentHashMap.compute(ConcurrentHashMap.java:1916)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.doComputeIfAbsent(BoundedLocalCache.java:2411)
        at com.github.benmanes.caffeine.cache.BoundedLocalCache.computeIfAbsent(BoundedLocalCache.java:2394)
        at com.github.benmanes.caffeine.cache.LocalCache.computeIfAbsent(LocalCache.java:108)
        at com.github.benmanes.caffeine.cache.LocalLoadingCache.get(LocalLoadingCache.java:54)
        at com.github.benmanes.caffeine.guava.CaffeinatedGuavaLoadingCache.get(CaffeinatedGuavaLoadingCache.java:59)
        at com.google.gerrit.server.project.ProjectCacheImpl.get(ProjectCacheImpl.java:227)
        at com.google.gerrit.server.index.project.AllProjectsIndexer.lambda$reindexProjects$0(AllProjectsIndexer.java:88)
        at com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:131)
        at com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:75)
        at com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:82)
        at com.google.gerrit.server.logging.LoggingContextAwareRunnable.run(LoggingContextAwareRunnable.java:113)
        at java.base/java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:572)
        at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:317)
        at java.base/java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:304)
        at com.google.gerrit.server.git.WorkQueue$Task.run(WorkQueue.java:912)
        at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1144)
        at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:642)
        at java.base/java.lang.Thread.run(Thread.java:1583)
[2025-05-20 10:13:43,160] [Index-Batch-3[com.google.gerrit.server.index.project.AllProjectsIndexer$$Lambda/0x00007f35f467ec78@9808933]] WARN  com.google.gerrit.server.cache.h2.H2CacheImpl : Cannot read cache jdbc:h2:file:///usr/local/gerrit/cache/persisted_projects-v2 for project: " 0000_meta-dev "
revision: "\250\237oj\254J\203\333\262M\235V,\346Z\207#\322\357H"
 
I solved this by clearing the cache and index folders and reindexing everything again (roughly the commands below).
Is this the "correct way" to fix this?
Am I the only one who has this issue?
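
For reference, this is roughly what I ran; the site path is taken from the error messages above and we run Gerrit as a systemd service, so adjust to your setup:

    sudo systemctl stop gerrit
    rm -rf /usr/local/gerrit/cache/*    # drop the old, now incompatible H2 cache files
    rm -rf /usr/local/gerrit/index/*    # drop the Lucene indexes
    cd /usr/local/gerrit
    java -jar bin/gerrit.war reindex    # offline reindex rebuilds the indexes; caches are recreated on demand
    sudo systemctl start gerrit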

In case this is important: I am planning an update from v3.8 and tested the process by upgrading a test server to v3.11.3 a few weeks ago. Everything went smoothly and seemed to work fine. So now I wanted to test 3.12 as well.


Thanks,
Dennis


Nasser Grainawi

May 20, 2025, 11:17:05 AM
to Dennis W, Repo and Gerrit Discussion
The H2 version changed, which makes the old cache files incompatible. It's in the release notes (https://www.gerritcodereview.com/3.12.html#new-h2-v2-storage-backend-for-persistent-caches) but not mentioned in the upgrade steps. It would probably be good to have it there, since I'm sure you won't be the only person to run into this.
 





Luca Milanesio

May 21, 2025, 3:25:52 AM
to Repo and Gerrit Discussion

On 20 May 2025, at 16:16, 'Nasser Grainawi' via Repo and Gerrit Discussion <repo-d...@googlegroups.com> wrote:



On Tue, May 20, 2025 at 9:06 AM 'Dennis W' via Repo and Gerrit Discussion <repo-d...@googlegroups.com> wrote:
Hello everyone,
I updated our testserver from 3.11.3 to 3.12.0 as an offline upgrade[1]
and was met with a lot of exceptions during reindexing:

I am surprised that it looks for the old files though: the H2 v2 files should have a different name.
It should have started with a fresh H2 v2 cache.

You should migrate the H2 v1 files to H2 v2 before starting any operation on Gerrit v3.12.

Thomas contributed a script [1] for performing the migration easily; however, [1] wasn't flagged for inclusion in the release notes, so I missed it :-(
I am going to create a change to document this step as a prerequisite for the migration.
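
For context, the approach H2 itself documents for going from 1.x to 2.x is to dump the database with the old engine and re-import it with the new one. A rough sketch for a single cache database follows; this is not the contributed script, and the jar paths, cache name and dump location are placeholders you would need to adapt:

    # Generic H2 1.x -> 2.x migration sketch for one Gerrit cache database.
    # Repeat for each database under $site/cache; add -user/-password to both
    # commands if your cache databases were created with credentials.
    OLD_H2_JAR=/path/to/h2-1.x.jar                # H2 jar shipped with the old Gerrit release
    NEW_H2_JAR=/path/to/h2-2.x.jar                # H2 jar shipped with Gerrit 3.12
    DB=/usr/local/gerrit/cache/some_cache         # H2 URL is the file path without the .db suffix

    # 1. Export the old-format database to SQL using the old engine.
    java -cp "$OLD_H2_JAR" org.h2.tools.Script -url "jdbc:h2:$DB" -script /tmp/cache-dump.sql

    # 2. Move the old database file(s) for that cache out of the way.
    mv "$DB".*.db /tmp/

    # 3. Re-import into a fresh database created by the new engine.
    java -cp "$NEW_H2_JAR" org.h2.tools.RunScript -url "jdbc:h2:$DB" -script /tmp/cache-dump.sql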

HTH

Luca.



lucamilanesio

May 21, 2025, 3:41:26 AM
to Repo and Gerrit Discussion

Reindexing changes: Slicing projects: 100% (384/384) (-)[2025-05-20 09:48:35,646] [Index-Batch-3[Index all changes of project 0000_meta-dev][0000_meta-dev-0]] WARN  com.google.gerrit.server.cache.h2.H2CacheImpl : Cannot read cache jdbc:h2:file:///usr/local/gerrit/cache/gerrit_file_diff-v2
If you look at the above message, you're using the H2 v2 files already: jdbc:h2:file:///usr/local/gerrit/cache/gerrit_file_diff-v2
 
Can you paste here the *exact* steps you performed for migrating from v3.11.3 to v3.12.0?
I've just followed the release notes and it works for me: the cache starts empty.

Luca.

Dennis W

May 22, 2025, 12:53:37 PM
to Repo and Gerrit Discussion
Hello,

lucamilanesio wrote on Wednesday, May 21, 2025 at 09:41:26 UTC+2:
Reindexing changes: Slicing projects: 100% (384/384) (-)[2025-05-20 09:48:35,646] [Index-Batch-3[Index all changes of project 0000_meta-dev][0000_meta-dev-0]] WARN  com.google.gerrit.server.cache.h2.H2CacheImpl : Cannot read cache jdbc:h2:file:///usr/local/gerrit/cache/gerrit_file_diff-v2
If you look at the above message, you're using the H2 v2 files already: jdbc:h2:file:///usr/local/gerrit/cache/gerrit_file_diff-v2
 
Can you paste here the *exact* steps you performed for migrating from v3.11.3 to v3.12.0?
I've just followed the release notes and it works for me: the cache starts empty.

Luca.

The steps were pretty much the suggested ones, except that I skipped backing up the indexes out of laziness (this is only a test server).
And while I did see in the release notes that the H2 backend was upgraded, I missed the part about it being incompatible and assumed the reindex would handle the necessary steps.

Steps I performed for the offline upgrade:
  1. Download the new gerrit.war -> https://gerrit-releases.storage.googleapis.com/gerrit-3.12.0.war
  2. Stop Gerrit
    -> we have it installed as a service -> sudo systemctl stop gerrit
  3. Ensure all installed plugins are compatible with the new API -> the only non-core plugins we use are:
    * rename-project
      -> no new version since v3.6.3
    * login-redirect
      -> downloaded the master version as there is no specific build for v3.12
  4. Run init: java -jar gerrit-3.12.0.war init --batch
    -> as I execute it in the Gerrit installation folder, the -d parameter was omitted
  5. Reindex all indexes: java -jar bin/gerrit.war reindex
And that worked from 3.8 to 3.11 without issues.
So I cannot say why the v2 files already existed, as this happened on the first reindex after upgrading from 3.11.3 to 3.12.0.
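
Put together as one command sequence, this is roughly what I ran (site directory /usr/local/gerrit as above; downloading the war with wget and restarting the service at the end are just how I happen to do it):

    cd /usr/local/gerrit
    wget https://gerrit-releases.storage.googleapis.com/gerrit-3.12.0.war
    sudo systemctl stop gerrit
    java -jar gerrit-3.12.0.war init --batch   # run inside the site directory, so -d is omitted
    java -jar bin/gerrit.war reindex           # offline reindex of all indexes
    sudo systemctl start gerrit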

Is there any downside to completely rebuilding the index and cache, or should one rather use the migration script?
Time shouldn't be an issue, as we don't have a very large Gerrit instance.

Thanks,
Dennis

Luca Milanesio

May 22, 2025, 12:59:12 PM
to Repo and Gerrit Discussion, Luca Milanesio
That is strange indeed.

Is there any downside to completely rebuilding index and cache or should one rather use the migration script?
Time shouldn't be an issue, as we don't have a very large gerrit instance.

If your Gerrit setup is small, that's fine.

HTH

Luca.
