Seeking Feedback on a Big Data Architecture Involving Kogito for Accounting Auditing


FRANCISCO HELIO DA CUNHA JUNIOR

Apr 11, 2024, 8:08:04 AM
to Kogito development mailing list

I am considering developing a solution for auditing accounting information that involves millions of tax documents.

The structure should allow each accounting document to pass through a rule engine maintained by the business area and generate alerts.

The architecture would be:

  1. Airflow detects when a tax document arrives.
  2. The tax document data is pre-processed by querying the relevant sources.
  3. The pre-processed data is published to Kafka.
  4. Kogito consumes the events from Kafka.
  5. Kogito executes a series of rules, some in parallel and some in sequence. The idea is that business users could potentially modify some of these rules.
  6. After the rules are processed, Kogito publishes the output back to a Kafka topic.
  7. A Kafka consumer saves the results in a non-relational database.
  8. An alert application reads from the non-relational database.
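To make the rules stage concrete, here is a minimal plain-Java sketch of the "parallel within a stage, series between stages" execution idea, independent of Kogito itself. Everything in it (the class name, the rule names, the thresholds) is hypothetical and only illustrates the intended control flow:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CompletableFuture;
import java.util.function.Predicate;

public class RulePipeline {

    record TaxDocument(String id, double declaredAmount, double computedAmount) {}

    /** A rule fires an alert when its predicate matches the document. */
    record Rule(String name, Predicate<TaxDocument> violates, String message) {}

    /** Stages run in series; the rules inside each stage run in parallel. */
    static List<String> run(TaxDocument doc, List<List<Rule>> stages) {
        List<String> alerts = new ArrayList<>();
        for (List<Rule> stage : stages) {
            List<CompletableFuture<String>> futures = stage.stream()
                .map(rule -> CompletableFuture.supplyAsync(
                    () -> rule.violates().test(doc)
                        ? rule.name() + ": " + rule.message() + " (doc " + doc.id() + ")"
                        : null))
                .toList();
            // Wait for the whole stage to finish before starting the next one.
            for (CompletableFuture<String> f : futures) {
                String alert = f.join();
                if (alert != null) alerts.add(alert);
            }
        }
        return alerts;
    }

    public static void main(String[] args) {
        // Hypothetical audit rules, for illustration only.
        var stages = List.of(
            List.of(
                new Rule("amount-mismatch",
                         d -> Math.abs(d.declaredAmount() - d.computedAmount()) > 0.01,
                         "declared and computed amounts differ"),
                new Rule("negative-amount",
                         d -> d.declaredAmount() < 0,
                         "declared amount is negative")),
            List.of(
                new Rule("high-value",
                         d -> d.computedAmount() > 1_000_000,
                         "document exceeds the high-value threshold")));

        var doc = new TaxDocument("NF-001", 1000.00, 1250.00);
        run(doc, stages).forEach(System.out::println);
    }
}
```

In Kogito the rules themselves would live in DRL files or a DMN model rather than Java predicates, which is what would make them maintainable by the business area; this sketch only shows the orchestration shape being asked about.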

I have some questions about Kogito and about this architecture:

  • Does using Kogito make sense in this big data architecture?
  • Is the idea of allowing users to change the rules viable?
  • How can the input data be parameterized for Kogito?
  • Do you have any suggestions for the architecture?