Hi
Just to add some extra information to Jeannine’s question, I’ve had a look at our data:
When creating a new version, a new directory is created at graphdb/data/repositories/<PROJECT NAME>-<VERSION> in the GraphDB instance. A corresponding directory is also created on the VocBench side at vocbench/SemanticTurkeyData/projects/<PROJECT NAME>/repositories/<VERSION>. However, the new version does not appear in the list of versions in the UI. Is there anything in these directories/files that would prevent them from appearing?
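For anyone wanting to run the same sanity check, something like the sketch below could be used on the server. The mock layout, the MyProject/v2 names, and the check commands are my own illustration; only the two repository paths come from the thread.

```shell
# Sketch: verify a version repository looks complete on both sides.
# The mktemp mock below is only so this runs standalone; on a real install,
# point GDB and VB at the actual graphdb/ and vocbench/ data directories.
set -eu
ROOT=$(mktemp -d)
GDB="$ROOT/graphdb/data/repositories"
VB="$ROOT/vocbench/SemanticTurkeyData/projects"
PROJECT=MyProject   # placeholder project name
VERSION=v2          # placeholder version label

# Mock a freshly created version (replace with your real directories):
mkdir -p "$GDB/$PROJECT-$VERSION/storage" "$VB/$PROJECT/repositories/$VERSION"
: > "$GDB/$PROJECT-$VERSION/config.ttl"

# The checks you would run against a real install:
test -d "$GDB/$PROJECT-$VERSION"             && echo "GraphDB version dir present"
test -f "$GDB/$PROJECT-$VERSION/config.ttl"  && echo "config.ttl present"
test -d "$GDB/$PROJECT-$VERSION/storage"     && echo "storage/ present"
test -d "$VB/$PROJECT/repositories/$VERSION" && echo "VocBench version dir present"
```

If any of those checks fails on a real install, that would narrow down which side of the pair is incomplete.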
Thanks
Martin
From: 'Jeannine Beeken' via vocbench-user <vocben...@googlegroups.com>
Sent: 02 June 2025 10:51
To: vocbench-user <vocben...@googlegroups.com>
Subject: [vocbench-user] Re: No registration/listing of dump of project
To view this discussion visit
https://groups.google.com/d/msgid/vocbench-user/65a93255-bd2d-400b-b575-2bccad03e4f0n%40googlegroups.com.
Hi Tiziano
I can confirm that the newly created GraphDB directory does contain both a config.ttl and a storage directory. However, a rough comparison of the contents shows major size differences in some of the files between the new version and the ‘_core’ directory (which I assume is the original), e.g. entities, entities.datatypes, etc.
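For reference, that rough size comparison can be reproduced along these lines. The mock directories and the entities file below are illustrative stand-ins; the ‘_core’ naming is taken from the thread, and CORE/NEW should point at the real directories under graphdb/data/repositories.

```shell
# Sketch: compare per-file sizes between the '_core' repository and a new
# version's directory. Mock layout for illustration only.
set -eu
ROOT=$(mktemp -d)
CORE="$ROOT/MyProject-_core"; NEW="$ROOT/MyProject-v2"   # placeholder names
mkdir -p "$CORE" "$NEW"
printf 'abcdef' > "$CORE/entities"   # mock files standing in for the real
printf 'abc'    > "$NEW/entities"    # entities / entities.datatypes etc.

# Report size differences file by file:
for f in "$CORE"/*; do
  name=$(basename "$f")
  core_size=$(wc -c < "$f")
  new_size=$(wc -c < "$NEW/$name" 2>/dev/null || echo missing)
  printf '%s: core=%s new=%s\n' "$name" "$core_size" "$new_size"
done
```

A file present in _core but missing (or drastically smaller) in the new version would suggest the dump was cut short.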
We are using version 10.6.2, running in an AWS Fargate container, talking to a VocBench instance also running as a Fargate service.
It seems to happen with all projects.
There are no special characters in the project names, only underscores and hyphens.
I did look for errors in graphdb/logs/error.log and found a few instances like this (although the timestamps are a good 10 minutes or so before the new version directory was created):
[ERROR] 2025-06-02 09:16:51,768 [http-nio-7200-exec-9 | o.a.c.c.C.[.[.[.[openrdf-http-server]] Servlet.service() for servlet [openrdf-http-server] in context with path [] threw exception [Request processing failed; nested exception is org.eclipse.rdf4j.http.server.ServerHTTPException: java.util.concurrent.ExecutionException: org.eclipse.rdf4j.rio.RDFHandlerException: org.apache.catalina.connector.ClientAbortException: java.net.SocketTimeoutException] with root cause
java.net.SocketTimeoutException: null
at org.apache.tomcat.util.net.NioEndpoint$NioSocketWrapper.doWrite(NioEndpoint.java:1426)
at org.apache.tomcat.util.net.SocketWrapperBase.doWrite(SocketWrapperBase.java:775)
at org.apache.tomcat.util.net.SocketWrapperBase.writeBlocking(SocketWrapperBase.java:600)
at org.apache.tomcat.util.net.SocketWrapperBase.write(SocketWrapperBase.java:544)
at org.apache.coyote.http11.Http11OutputBuffer$SocketOutputBuffer.doWrite(Http11OutputBuffer.java:540)
at org.apache.coyote.http11.filters.ChunkedOutputFilter.doWrite(ChunkedOutputFilter.java:112)
at org.apache.coyote.http11.Http11OutputBuffer.doWrite(Http11OutputBuffer.java:193)
at org.apache.coyote.Response.doWrite(Response.java:606)
at org.apache.catalina.connector.OutputBuffer.realWriteBytes(OutputBuffer.java:335)
at org.apache.catalina.connector.OutputBuffer.flushByteBuffer(OutputBuffer.java:777)
at org.apache.catalina.connector.OutputBuffer.append(OutputBuffer.java:680)
at org.apache.catalina.connector.OutputBuffer.writeBytes(OutputBuffer.java:383)
at org.apache.catalina.connector.OutputBuffer.write(OutputBuffer.java:361)
at org.apache.catalina.connector.CoyoteOutputStream.write(CoyoteOutputStream.java:97)
at com.github.ziplet.filter.compression.ThresholdOutputStream.write(ThresholdOutputStream.java:92)
at com.github.ziplet.filter.compression.CompressingServletOutputStream.write(CompressingServletOutputStream.java:66)
at java.base/java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:81)
at java.base/java.io.BufferedOutputStream.write(BufferedOutputStream.java:127)
at java.base/java.io.DataOutputStream.write(DataOutputStream.java:112)
at java.base/java.io.FilterOutputStream.write(FilterOutputStream.java:108)
at org.eclipse.rdf4j.rio.binary.BinaryRDFWriter.writeString(BinaryRDFWriter.java:346)
at org.eclipse.rdf4j.rio.binary.BinaryRDFWriter.writeLiteral(BinaryRDFWriter.java:318)
at org.eclipse.rdf4j.rio.binary.BinaryRDFWriter.writeValue(BinaryRDFWriter.java:293)
at org.eclipse.rdf4j.rio.binary.BinaryRDFWriter.writeValueOrId(BinaryRDFWriter.java:270)
at org.eclipse.rdf4j.rio.binary.BinaryRDFWriter.writeStatement(BinaryRDFWriter.java:223)
at org.eclipse.rdf4j.rio.binary.BinaryRDFWriter.consumeStatement(BinaryRDFWriter.java:208)
at org.eclipse.rdf4j.rio.helpers.AbstractRDFWriter.handleStatement(AbstractRDFWriter.java:109)
at org.eclipse.rdf4j.repository.sail.SailRepositoryConnection.exportStatements(SailRepositoryConnection.java:390)
at org.eclipse.rdf4j.http.server.repository.transaction.Transaction.lambda$exportStatements$6(Transaction.java:244)
at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)
at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)
at java.base/java.lang.Thread.run(Thread.java:840)
To view this discussion visit https://groups.google.com/d/msgid/vocbench-user/67861f4e-48bf-4093-bb19-ca350c6dcfcfn%40googlegroups.com.