Hi,
I'm considering using Neo4j and I've looked at the hardware sizing calculator. It shows that for 5M nodes / 120M relationships (each with 8 bytes of properties) I'll need at least 75GB of memory, which is huge.
In my case I want to run some analytical queries on the dataset -- just several concurrent queries at most, and latency is not critical. The queries I need to run are: for a given set of nodes, find all nodes reachable through relationships meeting a given condition. The queries should return just a few nodes on average, and never more than tens of thousands.
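To make the workload concrete, here is a sketch of the kind of query I have in mind (the :Item label, :LINKED relationship type, weight property, and $startIds parameter are placeholders, not my real schema):

    // Find every node reachable from a given start set, following only
    // relationships whose property meets the condition.
    MATCH path = (start:Item)-[:LINKED*]->(reachable)
    WHERE start.id IN $startIds
      AND all(r IN relationships(path) WHERE r.weight > 10)
    RETURN DISTINCT reachable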
The main question I have is whether Neo4j can run with less memory (5GB? 10GB?) and instead read from disk when required.
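For example, would something along these lines in neo4j.conf be workable? (Setting names are the dbms.memory.* ones from recent Neo4j versions; the values are just my guess at a small deployment.)

    # Hypothetical sizing -- far below what the calculator suggests.
    dbms.memory.heap.initial_size=2g
    dbms.memory.heap.max_size=2g
    # The page cache keeps the hot part of the store files in RAM;
    # anything that doesn't fit would be read from disk on demand.
    dbms.memory.pagecache.size=6g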
How does it compare to regular relational databases in this scenario? I assume it would still be faster, but is that true?
Is the calculator's 75GB estimate reasonable?
Thanks,
Marcin