Hello team. I've been working on a POC for a project that requires rules processing over big data. The data sits in Delta Lake.
What would be the best way to load facts from a Delta table, considering that the records are huge (the original JSON files have thousands of keys per object)?
I would start with Ergo's JSON interface to read in the JSON term
and convert it into a bunch of queryable objects.
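To make the conversion step concrete, here is a minimal Python sketch of flattening one large JSON record into Ergo-style frame facts (`oid[attr -> value]`). The frame syntax follows ErgoAI conventions, but the flattening scheme, the child-object naming (`rec1_owner`), and the idea of generating facts as text are assumptions for illustration, not Ergo's actual JSON interface:

```python
import json

def json_to_ergo_facts(oid, obj):
    """Flatten one JSON object into Ergo-style frame fact strings.
    Nested objects get a derived oid (hypothetical naming scheme);
    list elements become repeated values for the same attribute."""
    facts = []
    for key, value in obj.items():
        if isinstance(value, dict):
            child = f"{oid}_{key}"              # derived oid for the nested object
            facts.append(f"{oid}[{key} -> {child}].")
            facts.extend(json_to_ergo_facts(child, value))
        elif isinstance(value, list):
            for item in value:
                facts.append(f"{oid}[{key} -> {json.dumps(item)}].")
        else:
            facts.append(f"{oid}[{key} -> {json.dumps(value)}].")
    return facts

record = json.loads('{"id": 7, "tags": ["a", "b"], "owner": {"name": "x"}}')
facts = json_to_ergo_facts("rec1", record)
# facts[0] is 'rec1[id -> 7].'
```

For thousands of keys per object you would stream records out of the Delta table (e.g. with a Delta reader of your choice) and emit facts per record rather than materializing everything at once.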
What are the best practices for creating rules from big regulation files? Are there any tutorials on creating rules in general?
Are these rules already encoded in JSON? If so, after the above conversion, use the created objects to reassemble the rules in valid Ergo syntax.
If the rules need to be developed from scratch, then it is an art: analyze the problem, decide on the schema, what is given, what needs to be derived, and so on.
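The "reassemble the rules" step above might look like the following sketch, which turns a JSON-encoded rule into an Ergo rule string. The JSON schema here (`{"head": ..., "body": [...]}`) is purely hypothetical; whatever encoding your regulation files use would dictate the real mapping:

```python
import json

def rule_to_ergo(rule):
    """Assemble an Ergo rule string (head :- body.) from a
    JSON-encoded rule. The input schema is a made-up example."""
    head = rule["head"]
    body = " , ".join(rule["body"])   # conjoin body literals
    return f"{head} :- {body}."

encoded = json.loads('{"head": "?X[eligible -> true]",'
                     ' "body": ["?X[age -> ?A]", "?A >= 18"]}')
ergo_rule = rule_to_ergo(encoded)
# → "?X[eligible -> true] :- ?X[age -> ?A] , ?A >= 18."
```

The point is that once the JSON is parsed into ordinary objects, emitting Ergo syntax is plain string assembly; the hard part is deciding the schema, which is what the tutorial below walks through.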
We have a tutorial explaining the process using a real life example:
https://sites.google.com/a/coherentknowledge.com/tutorial-capturing-real-world-knowledge
--
--- michael
It's in the Guide to ErgoAI Packages, Ch 11.
Also, the Examples Bank includes a fairly extensive example of using JSON in ErgoAI.
https://github.com/ErgoAI
--
--- michael