Dear Chris/Vangelis,
We are working on our first DynaLearn release in which we do all our
OWL generation/parsing with Thea. We are very happy with it.
We noticed something weird with the test set of a few hundred models.
I was under the assumption that Thea 'consumes' the triples when
loading an OWL file. However, after loading, I can query the rdf graph
with rdf_has. Furthermore, after calling retract_all_axioms, the rdf
graph also remains.
Below is a test case performed with the latest git versions of SWI-Prolog
and Thea (note that I changed one line in Thea, shown in the diff below).
Best regards,
Jochem
--- a/owl2_io.pl
+++ b/owl2_io.pl
@@ -151,7 +151,7 @@
convert_axioms(FileIn,FmtIn,FileOut,FmtOut,Opts) :-
% TODO - check if this is the best way of doing this
load_handler(Dir,Fmt) :-
forall(format_module(Dir,Fmt,Mod),
- ensure_loaded(library(thea2/Mod))).
+ ensure_loaded(Mod)).
Test case:
:- use_module(library('semweb/rdf_db')).
:- ensure_loaded('owl2_model.pl').
:- ensure_loaded('owl2_from_rdf.pl').
:- ensure_loaded('owl2_export_rdf.pl').
:- ensure_loaded('owl2_xml.pl').
:- ensure_loaded('owl2_io.pl').
:-
    absolute_file_name('LS4-CausalDifferentiation.owl', FileName),
    not(rdf_has(_A, _B, _C)),
    format('No triples in the beginning.\n'),
    load_axioms(FileName),
    rdf_db:rdf_has(_D, _E, _F),
    format('Triples after load_axioms. I thought they were consumed?\n'),
    retract_all_axioms,
    rdf_db:rdf_has(_G, _H, _I),
    format('After retract_all_axioms, I still have triples! Memory leak!\n').
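In case it helps, the workaround we are currently considering is to clear the RDF store ourselves after retracting the axioms. This is only a sketch, assuming the leftover triples live in the default rdf_db store; rdf_retractall/3 is the standard library(semweb/rdf_db) predicate, and free_axioms/0 is just our own hypothetical helper name, not part of Thea:

    :- use_module(library('semweb/rdf_db')).

    % free_axioms: hypothetical helper, not part of Thea.
    % Retract the Thea axioms, then drop whatever triples
    % load_axioms left behind in the rdf_db triple store.
    free_axioms :-
        retract_all_axioms,
        rdf_retractall(_, _, _).

Of course, if Thea is expected to consume the triples on load, the real fix would be inside the loader rather than a cleanup like this.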