On 01/22/2015 09:51 AM, Pedro Ferreira wrote:
> For huge, I mean very huge.
> Probably Gigabytes or Terabytes of data. Full data in memory is not an option.
> I saw something about tables, I will take a deeper look.
Tables are basically only suitable for static data that must be accessed
on a single primary key that can be sorted (not necessarily unique).
You could use them for dynamic data if you can preserve the sorted
structure by only appending new data at the end. I am not sure whether
the library handles dynamic changes, though. I used it long ago to store
background knowledge, in the days when we had a few hundred MB of main memory.
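As a rough illustration (the file name, column layout and options below are
made up, and I am quoting the library(table) API from memory, so check the
docs), opening a sorted column file and searching it looks something like:

    :- use_module(library(table)).

    %% Open a whitespace-separated file whose first column is a
    %% sorted atom key; the handle can then be searched by key.
    open_background(Handle) :-
        new_table('facts.dat',                 % hypothetical data file
                  [ key(atom, [sorted]),       % sorted primary key column
                    value(integer, [])
                  ],
                  [ field_separator(0'\t)
                  ],
                  Handle).

    %% Look up all records matching Key (binary search on the
    %% sorted key, so no need to load the file into memory).
    bg_fact(Handle, Key, Value) :-
        in_table(Handle, [Key, Value], _RecordPos).

The point is that the file is mapped, not loaded: lookups are a binary
search over the sorted key column, so the data may be far larger than
physical memory.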
If Prolog itself doesn't suffice and your data is dynamic, you need to
access a proper DB. There are interfaces to BerkeleyDB, SQLite, or
client/server solutions through ODBC, optionally using CQL for nice and
clean access to complicated databases. Of course, you could use one of
these as background storage and cache live data in the Prolog database.
That requires a bit of programming, but can provide excellent performance.
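The cache-in-front-of-a-DB idea can be sketched in a few lines. This is
only a sketch: the DSN 'mydb', the table/column names and the single-value
result shape are all invented for the example, and it ignores cache
invalidation entirely.

    :- use_module(library(odbc)).
    :- dynamic cached_value/2.

    %% lookup(+Key, -Value)
    %% First try the in-memory cache (the Prolog database);
    %% on a miss, fetch from the external DB and memoize.
    lookup(Key, Value) :-
        cached_value(Key, Value), !.
    lookup(Key, Value) :-
        odbc_connect('mydb', Conn, []),          % hypothetical DSN
        odbc_prepare(Conn,
                     'SELECT value FROM data WHERE key = ?',
                     [varchar], Stmt),
        odbc_execute(Stmt, [Key], row(Value)),
        assertz(cached_value(Key, Value)),       % memoize for next time
        odbc_free_statement(Stmt),
        odbc_disconnect(Conn).

In a real program you would keep the connection and prepared statement
around rather than reconnecting per query, and retract or refresh cached
facts when the underlying data changes.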
Success --- Jan