There are two bounded contexts: Registration and Billing. A process manager sits between them: it listens to events from the Registration context and transforms them into commands that are sent to the Billing context.
## Registration

The Registration context emits a `UsageWasRegistered` event with the fields:

- `articleThatWasUsed`
- `numberOfArticlesUsed`
- `userThatUsedTheArticle`
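The event above could be sketched as a simple immutable record. The field names come from the post; the dataclass representation and the field types are my assumptions:

```python
from dataclasses import dataclass

# Sketch of the Registration event described above. Field names are from the
# post; the types and the frozen-dataclass representation are assumptions.
@dataclass(frozen=True)
class UsageWasRegistered:
    articleThatWasUsed: str
    numberOfArticlesUsed: int
    userThatUsedTheArticle: str

event = UsageWasRegistered("Beer", 2, "John")
```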
A read model aggregates these events by user and article.
User | Article | Quantity
---- | ------- | --------
John | Beer | 2
John | Water | 1
Lisa | Wine | 5
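The aggregation described above could look roughly like this projection; the class and method names are assumptions, the point is only that quantities are summed per (user, article) pair:

```python
from collections import defaultdict

# Minimal sketch of the read-model projection: aggregate usage quantities
# per (user, article). Class and method names are assumptions.
class UsagePerUserReadModel:
    def __init__(self):
        self.quantities = defaultdict(int)  # (user, article) -> quantity

    def apply_usage_was_registered(self, user, article, quantity):
        self.quantities[(user, article)] += quantity

rm = UsagePerUserReadModel()
rm.apply_usage_was_registered("John", "Beer", 1)
rm.apply_usage_was_registered("John", "Beer", 1)
rm.apply_usage_was_registered("John", "Water", 1)
```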
These aggregated usages can be sent to the Billing context with a `SendUsageForUserToBilling` command (for user John)...
...which results in a `UsagesForUserWereSentToBilling` event (for user John)...
...which updates the read model and marks those usages as "sent to billing".
User | Article | Quantity | Sent to billing
---- | ------- | -------- | ----------------
John | Beer | 2 | yes
John | Water | 1 | yes
Lisa | Wine | 5 | no
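The second projection step, flagging a user's rows once the event arrives, could be sketched like this. The row layout and function name are assumptions:

```python
# Sketch of the second projection step: when UsagesForUserWereSentToBilling
# arrives for a user, flag that user's unsent rows. Row layout is assumed.
rows = [
    {"user": "John", "article": "Beer",  "quantity": 2, "sent": False},
    {"user": "John", "article": "Water", "quantity": 1, "sent": False},
    {"user": "Lisa", "article": "Wine",  "quantity": 5, "sent": False},
]

def apply_usages_were_sent_to_billing(rows, user):
    for row in rows:
        if row["user"] == user and not row["sent"]:
            row["sent"] = True

apply_usages_were_sent_to_billing(rows, "John")
```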
## Process Manager
The process manager listens to the `UsagesForUserWereSentToBilling` event and has a dependency on the read model that contains the aggregated usages. When a `UsagesForUserWereSentToBilling` event is received, the read model is queried by user id and the resulting rows are used to build a command that is sent to the Billing context.
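The process manager described above could be sketched as follows. Apart from the event name, everything here (the query method, the command shape, the bus) is an assumption, not the actual implementation:

```python
# Hedged sketch of the process manager: on UsagesForUserWereSentToBilling it
# queries the read model by user and turns the rows into a Billing command.
# All names other than the event itself are assumptions.
class ProcessManager:
    def __init__(self, read_model, billing_bus):
        self.read_model = read_model
        self.billing_bus = billing_bus

    def on_usages_were_sent_to_billing(self, user_id):
        rows = self.read_model.usages_sent_to_billing(user_id)
        command = {"type": "BillUsages", "userId": user_id, "lines": rows}
        self.billing_bus.send(command)

# Tiny stubs to show the wiring.
class _StubReadModel:
    def usages_sent_to_billing(self, user_id):
        return [{"article": "Beer", "quantity": 2}]

class _StubBus:
    def __init__(self):
        self.sent = []
    def send(self, cmd):
        self.sent.append(cmd)

bus = _StubBus()
pm = ProcessManager(_StubReadModel(), bus)
pm.on_usages_were_sent_to_billing("John")
```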
The problem I'm having is that both the process manager and the read model are subscribed to the same event, `UsagesForUserWereSentToBilling`. It is entirely possible for the process manager to receive this event before the read model does. In that case, the process manager queries the read model for John's usages that were sent to billing, but since the read model hasn't been updated yet, it finds no usages for John that are ready to be sent to billing.
There are a couple of things I have thought about:
Order the subscribers so that the read model receives the event first. This is what currently runs in production.

I don't want to keep doing this, because I feel it creates a fragile, complex system. At the moment the entire process (send command, dispatch event, update read model, run the process manager, send the next command) takes place synchronously in one big transaction. However, it is entirely possible that we'll be forced to go asynchronous in the future, and ordering the subscribers won't work once delivery is async.
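The ordering workaround could be sketched like this. The dispatcher API is an assumption; the point is that correctness now depends on registration order, which is exactly the guarantee that disappears once delivery becomes asynchronous:

```python
# Sketch of option 1: a dispatcher that calls subscribers in a fixed order so
# the read model always sees the event before the process manager does.
# The dispatcher API is an assumption, not the production code.
class OrderedDispatcher:
    def __init__(self):
        self.subscribers = []  # order of registration is load-bearing

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def dispatch(self, event):
        for handler in self.subscribers:
            handler(event)  # synchronous, in registration order

log = []
d = OrderedDispatcher()
d.subscribe(lambda e: log.append("read_model"))       # must run first
d.subscribe(lambda e: log.append("process_manager"))  # must run second
d.dispatch("UsagesForUserWereSentToBilling")
```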
Have a process check the usages read model periodically and send commands.

This would make the entire process asynchronous. As said before, I like that everything currently happens in one big transaction, so I'd rather not go this route yet.
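That polling approach could be sketched as follows. The row layout and the extra "forwarded" flag (needed to avoid issuing duplicate commands across polling runs) are my assumptions; a real version would run on a scheduler instead of being called directly:

```python
# Sketch of option 2: a periodic job scans the read model for usages marked
# "sent to billing" that have not yet been forwarded, and issues Billing
# commands for them. The "forwarded" flag is an assumed dedup mechanism.
def poll_and_forward(rows, send_command):
    for row in rows:
        if row["sent_to_billing"] and not row["forwarded"]:
            send_command({"user": row["user"], "quantity": row["quantity"]})
            row["forwarded"] = True

commands = []
rows = [
    {"user": "John", "quantity": 2, "sent_to_billing": True,  "forwarded": False},
    {"user": "Lisa", "quantity": 5, "sent_to_billing": False, "forwarded": False},
]
poll_and_forward(rows, commands.append)
```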
What are your thoughts?