I would like to implement the outbox pattern with the following concepts:
- the event-generating service (let's call it Generator) saves events to its db in the same transaction as the data change (the core principle of the outbox pattern)
- a separate event dispatcher service (let's call it Dispatcher) periodically pulls the generated events from the db, sends them (synchronously, over HTTP) to the external service managing Kafka (a black box for me), and then marks them as sent in the db
- a REST API would allow manual creation and modification of events in case such a need occurs (a simple CRUD)
- the expected traffic is rather small (max 10k requests a day)
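To make the first bullet concrete, here is a minimal sketch of the transactional outbox write, using SQLite as a stand-in for the Generator's database. The table and column names (`orders`, `outbox`, `payload`, `sent`) are illustrative assumptions, not part of my actual schema:

```python
import json
import sqlite3

# In-memory SQLite stands in for the Generator's real database.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT)")
conn.execute("""CREATE TABLE outbox (
    id INTEGER PRIMARY KEY,
    payload TEXT NOT NULL,
    sent INTEGER NOT NULL DEFAULT 0)""")

def create_order(status: str) -> int:
    # The data change and the event insert commit (or roll back) together:
    # that atomicity is the whole point of the outbox pattern.
    with conn:  # one transaction
        cur = conn.execute("INSERT INTO orders (status) VALUES (?)", (status,))
        order_id = cur.lastrowid
        event = json.dumps({"type": "OrderCreated", "order_id": order_id})
        conn.execute("INSERT INTO outbox (payload) VALUES (?)", (event,))
    return order_id
```

The key property is that no event row can exist without its matching data change, and vice versa.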
As I understand it, this can be realised in two ways.
One approach is for both services to share a db. This would require less work to implement and is generally simpler. My concerns with it are as follows:
- I don't really have experience with this setup and don't entirely know what problems to expect (apart from making sure that enough connections are available)
- sharing a db between microservices is an anti-pattern in my mind (although I'm probably biased)
- I'm not sure which service should have the REST API for the events as both seem equally right (or equally wrong) for that
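Under the shared-db approach, the Dispatcher's polling cycle would look roughly like this sketch. `deliver` stands in for the synchronous HTTP call to the external Kafka-managing service, and the `outbox` schema is the same illustrative assumption as above:

```python
import json
import sqlite3

def dispatch_pending(conn: sqlite3.Connection, deliver) -> int:
    """Pull unsent events, deliver each, and mark it sent. Returns the count."""
    rows = conn.execute(
        "SELECT id, payload FROM outbox WHERE sent = 0 ORDER BY id").fetchall()
    delivered = 0
    for event_id, payload in rows:
        deliver(json.loads(payload))  # synchronous HTTP POST in practice
        with conn:                    # mark as sent only after a successful send
            conn.execute("UPDATE outbox SET sent = 1 WHERE id = ?", (event_id,))
        delivered += 1
    return delivered
```

Note that if the process dies between a successful send and the UPDATE, the event is re-sent on the next poll, so the consumer side would need to tolerate at-least-once delivery either way.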
The second approach is to leave the Dispatcher service without db access and have it communicate with the Generator service through an API. This approach seems much clearer to me, not only because the db is not shared, but because the Dispatcher then has the clear responsibility of just communicating with the external service, while the Generator becomes the single owner of the events.
However, what I don't like about this idea is the additional network hop with all its potential problems (latency, errors, etc.).
What would be the recommended approach in this situation?