Hi Moises,
Thank you for your reports. See my comments below:
I'm also writing to point out that your current master installation at
http://apollo.hpclab.ceid.upatras.gr:8000/jspui18/ is throwing
out-of-memory errors when browsing by categories. Perhaps my queries to
your OAI service triggered this exception? Only the log file can tell.
Here's the message:
HTTP Status 500 -
type Exception report
message
description The server encountered an internal error () that prevented
it from fulfilling this request.
exception
javax.servlet.ServletException: Servlet execution threw an exception
org.dspace.utils.servlet.DSpaceWebappServletFilter.doFilter(DSpaceWebappServletFilter.java:78)
root cause
java.lang.OutOfMemoryError: PermGen space
note The full stack trace of the root cause is available in the Apache
Tomcat/6.0.24 logs.
I guess this is typical of DSpace, and I can hardly see any relation to
the OAI service. See this for details:
https://wiki.duraspace.org/display/DSDOC18/Performance+Tuning+DSpace?from=YQMiAQ
In our case we might need to increase the PermGen space a bit and see.
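For reference, the change would most likely be a JVM option along these
lines (a sketch only; the exact value, and the use of a bin/setenv.sh
file, are assumptions about our Tomcat setup):

CATALINA_OPTS="$CATALINA_OPTS -XX:MaxPermSize=256m"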
And a final question about your addon: when we load a new ontology in
the search form, does it generate all the semantic information by
parsing the OAI response with a custom XSLT stylesheet to obtain
semantic triples? I suppose it works that way.
Yes it does.
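To make that concrete, here is a minimal sketch of the idea using the
standard Java XSLT API. The endpoint URL and the stylesheet name
oai2rdf.xsl are placeholders for illustration, not the addon's actual
code:

import java.io.File;
import java.net.URL;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

public class OaiToTriples {
    public static void main(String[] args) throws Exception {
        // Placeholder OAI-PMH ListRecords request (Dublin Core metadata).
        String oaiRequest =
            "http://example.org/oai/request?verb=ListRecords&metadataPrefix=oai_dc";

        // Compile the custom XSLT that maps OAI records to RDF triples
        // (oai2rdf.xsl is a hypothetical file name).
        Transformer t = TransformerFactory.newInstance()
            .newTransformer(new StreamSource(new File("oai2rdf.xsl")));

        // Apply the stylesheet to the harvested response and write the
        // resulting RDF/XML, which can then be loaded by the reasoner.
        t.transform(new StreamSource(new URL(oaiRequest).openStream()),
                    new StreamResult(new File("triples.rdf")));
    }
}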
And if it does, how does it manage large repository datasets, for
example with thousands of records? Won't such an approach fill the
server's memory?
Our published experiments with up to 4,500 records (about 200K triples)
show that this approach can scale well (at least with FaCT++), given
the precomputation. Still, we are investigating more scalable
approaches such as triple stores.
Thank you for your patience and attention.
Best regards,
Dimitrios