How to automatically update SKOSMOS dumps?


Felix Ernst

Nov 11, 2025, 2:52:21 AM
to Skosmos Users

Hi everyone,

I have a workflow where I use a SKOS vocabulary editor that writes every change directly to the triple store (Fuseki), so the changes show up in Skosmos right away. But this means that the dump offered in Skosmos is outdated after every change. Is there another way to offer the dump? Or could I even use the API to download the whole vocabulary, so that I don't have to update the dump on every change? In the API documentation I only found how to download single terms (which Skosmos also offers), not how to query a whole vocabulary.


Best
Felix


Osma Suominen

Nov 11, 2025, 3:54:43 AM
to Skosmos Users
Hi Felix!

Fuseki supports the SPARQL 1.1 Graph Store HTTP Protocol, which among other things allows you to download dataset graphs in any RDF serialization the server supports. You can use this to get a dump of your dataset. For example, if you have Fuseki running on localhost:3030 with a dataset "skosmos" and your vocabulary in the graph "http://example.org", you could download it to a file called "example.ttl" with this curl command:

curl -X GET -H "Accept: text/turtle" "http://localhost:3030/skosmos/data?graph=http%3A%2F%2Fexample.org" -o example.ttl
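The same endpoint can serve other RDF serializations via content negotiation; for example, this variant (same placeholder URLs as above) would fetch RDF/XML instead:

curl -H "Accept: application/rdf+xml" "http://localhost:3030/skosmos/data?graph=http%3A%2F%2Fexample.org" -o example.rdf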

You could use a cronjob to run this, say, once per hour, to produce the dump that is offered for download. That way your dumps would always be relatively fresh.
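As a minimal sketch (the target path /var/www/dumps/example.ttl is just a placeholder for wherever your web server serves the dump from), a crontab entry like this would refresh the dump at the top of every hour; downloading to a temporary file first and moving it into place avoids serving a half-written dump, and curl's -f flag keeps an HTTP error response from overwriting a good dump:

# NOTE: cron treats literal % as a newline, so the URL-encoded graph URI must be escaped with backslashes
0 * * * * curl -sf -H "Accept: text/turtle" "http://localhost:3030/skosmos/data?graph=http\%3A\%2F\%2Fexample.org" -o /var/www/dumps/example.ttl.tmp && mv /var/www/dumps/example.ttl.tmp /var/www/dumps/example.ttl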

Another option would be to point the `void:dataDump` configuration directly to the Fuseki HTTP Graph Store URL, allowing Fuseki to serve a fresh dump on every request. But you would then have to expose Fuseki (at least this API) to the internet, which could cause security concerns. Also, producing a dump is quite a heavy operation, so this could cause a lot of load on Fuseki.
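For illustration, assuming a vocabulary entry ":example" in your Skosmos config.ttl and a publicly reachable Fuseki host fuseki.example.org (both made-up names), the entry could look something like this, using the prefixes already declared at the top of config.ttl:

:example a skosmos:Vocabulary, void:Dataset ;
    dc:title "Example Vocabulary"@en ;
    void:uriSpace "http://example.org/" ;
    skosmos:sparqlGraph <http://example.org> ;
    void:sparqlEndpoint <http://localhost:3030/skosmos/sparql> ;
    # dump served live from the Fuseki Graph Store endpoint
    void:dataDump <http://fuseki.example.org/skosmos/data?graph=http%3A%2F%2Fexample.org> .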

Best,
Osma