I don't understand how to explain the difference, but perhaps it has something to do with garbage collection? During the loading of the triples, memory consumption was at 100%, and perhaps Source 2 simply 'fitted' in memory.
That is an enormous variation, far beyond what we ordinarily see.
My first guess is that you should run a database you want good performance from on a real machine, not a virtual one (where I/O can be very poor, among other things).
Beyond that, it could be GC, or it could be something else.
We'll take a look and see if we can reproduce.
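In the meantime, if you want to test the GC theory yourself, you could enable GC logging on the Stardog server JVM. A minimal sketch, assuming your installation passes JVM options through the STARDOG_JAVA_ARGS environment variable (the GC flags themselves are standard HotSpot options in Java 8 syntax, not Stardog-specific, and the 4g heap is just an example value):

    # Enable GC logging for the Stardog server JVM (Java 8 flag syntax)
    export STARDOG_JAVA_ARGS="-Xmx4g -Xms4g -verbose:gc -XX:+PrintGCDetails -Xloggc:/var/log/stardog-gc.log"

    # Restart the server so the options take effect, then repeat the load
    stardog-admin server stop
    stardog-admin server start

Long or frequent full-GC pauses in that log during the load would point to GC as the culprit.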
Also, when Stardog is running with the data loaded but idle, memory usage is about 3 GB! Will it actually work in the cloud on a small instance with, let's say, 512 MB available for Stardog?
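If it's just a matter of configuration, this is roughly what I would try on such an instance (assuming STARDOG_JAVA_ARGS is the right knob here; -Xmx/-Xms are standard JVM flags):

    # Cap the Stardog JVM heap for a small cloud instance; whether
    # Stardog stays usable within this limit is exactly my question.
    export STARDOG_JAVA_ARGS="-Xmx512m -Xms512m"
    stardog-admin server start

But given that idle usage is already around 3 GB, I'm not sure a 512 MB cap is realistic.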