- You haven't yet told us your server specs. Tell us as much as you can: in particular the exact GraphDB version, the amount of RAM, and whether you have an SSD.
- What reasoning do you use?
- What's the total number of triples, explicit and inferred? (That info is shown in a tooltip over the repo name.)
- If you want to see property and class info, load the ontologies. But set NO reasoning before that, since all inferred consequences are already included in the export.
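If the tooltip isn't handy, you can also get the total triple count with a simple SPARQL query against the repository (in GraphDB this counts both explicit and inferred triples by default). This is just a sketch; run it in the Workbench SPARQL editor or against your repo's endpoint:

```sparql
# Total number of triples (explicit + inferred) in the default dataset
SELECT (COUNT(*) AS ?total)
WHERE { ?s ?p ?o }
```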
My guess is that you're doing a lot of unnecessary inference. We use rather specific inference; see
http://vocab.getty.edu/doc/#Inference in particular
Reduced SKOS Inference and Hierarchical Relations Inference.
So you're best off without any inference.
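To load with no inference at all, create the repository with the "empty" ruleset. Below is a sketch of a repository config in GraphDB's Turtle template format; the repository ID "getty" and label are hypothetical, and the exact prefixes/parameters may differ slightly between GraphDB versions, so check the template that ships with your version:

```turtle
@prefix rep:     <http://www.openrdf.org/config/repository#> .
@prefix sr:      <http://www.openrdf.org/config/repository/sail#> .
@prefix sail:    <http://www.openrdf.org/config/sail#> .
@prefix rdfs:    <http://www.w3.org/2000/01/rdf-schema#> .
@prefix graphdb: <http://www.ontotext.com/config/graphdb#> .

[] a rep:Repository ;
   rep:repositoryID "getty" ;                # hypothetical repo ID
   rdfs:label "Getty vocabularies, no inference" ;
   rep:repositoryImpl [
      rep:repositoryType "graphdb:SailRepository" ;
      sr:sailImpl [
         sail:sailType "graphdb:Sail" ;
         graphdb:ruleset "empty"             # disables all reasoning
      ]
   ] .
```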
http://vocab.getty.edu/doc/#Total_Exports says "Because it includes all required Inference, you can load it to any repository (even one without RDFS reasoning)", but rereading it now, I see it doesn't explicitly say "use no inference".
7 days is unacceptably slow and not comparable to the 5 hours it took Plamen, so once we diagnose it here, we'll raise a GraphDB support ticket.