CQRS stands for Command Query Responsibility Segregation: you use one model to update information and a different model to read it. Typically, "different model" means a distinct object model, often running on a different machine.
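A minimal, framework-free sketch of that split (all names here are illustrative, not taken from the project's code): the write side appends domain events, and the read side maintains a separate, query-optimized view built from those events.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class CqrsSketch {
    // Write model: validates commands and appends events (state changes),
    // rather than overwriting current state.
    static class WriteModel {
        final List<String> eventLog = new ArrayList<>();
        void credit(String account, int amount) {
            eventLog.add(account + ":" + amount); // persist the event
        }
    }

    // Read model: a denormalized view built by replaying/consuming events.
    static class ReadModel {
        final Map<String, Integer> balances = new HashMap<>();
        void apply(String event) {
            String[] parts = event.split(":");
            balances.merge(parts[0], Integer.parseInt(parts[1]), Integer::sum);
        }
        int balanceOf(String account) {
            return balances.getOrDefault(account, 0);
        }
    }

    public static void main(String[] args) {
        WriteModel write = new WriteModel();
        write.credit("alice", 100);
        write.credit("alice", 50);

        ReadModel read = new ReadModel();
        write.eventLog.forEach(read::apply); // the projection step
        System.out.println(read.balanceOf("alice")); // prints 150
    }
}
```

In the project itself, the event log lives in Kafka and the projection step is a message consumer rather than an in-process loop.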
In this example I am using:
- Kafka as the event store (for event sourcing)
- MongoDB as the write store
- Elasticsearch, which could serve as the read store
- Spring Cloud Stream to abstract the event store and the publish/subscribe mechanism
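To make the roles above concrete, here is a toy in-memory event store with publish/subscribe, standing in for what Kafka plus Spring Cloud Stream provide in the project (the class and method names are illustrative):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ToyEventStore {
    final List<String> log = new ArrayList<>();           // append-only event log
    final List<Consumer<String>> subscribers = new ArrayList<>();

    void subscribe(Consumer<String> subscriber) {
        subscribers.add(subscriber);
    }

    void publish(String event) {
        log.add(event);                            // durable append (a Kafka topic, in the project)
        subscribers.forEach(s -> s.accept(event)); // fan out to all consumers
    }

    public static void main(String[] args) {
        ToyEventStore store = new ToyEventStore();
        List<String> seen = new ArrayList<>();
        store.subscribe(seen::add);
        store.publish("OrderCreated");
        System.out.println(seen); // [OrderCreated]
    }
}
```

Spring Cloud Stream's value is that the application only sees message channels; swapping Kafka for another binder (e.g. RabbitMQ) is a configuration change, not a code change.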
This is actually a precursor project to applying Spring Cloud Data Flow, which will handle instance counts, partitioning between consumers and producers, and ad-hoc creation of data microservices using a familiar Java API and well-known semantics from Spring Integration such as Sink, Source, and Transformer.
We enable an output message channel on the app using `@Autowired private MessageChannel output;`.

This is a small app backed by MongoDB. Spring Data provides special hooks like afterSave and afterDelete that are overridden in the application to apply event sourcing.
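The save-hook pattern can be sketched without Spring (the names below are illustrative stand-ins; in the app itself the hooks come from Spring Data, and the channel from Spring Cloud Stream):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class EventSourcingHook {
    // Stand-in for the output message channel events are published to.
    static class OutputChannel {
        final List<String> published = new ArrayList<>();
        void send(String event) { published.add(event); }
    }

    // Stand-in for a repository whose afterSave hook is overridden:
    // after persisting an entity, the hook publishes a corresponding event.
    static class Repository {
        final List<String> store = new ArrayList<>();
        final Consumer<String> afterSave; // the hook

        Repository(Consumer<String> afterSave) { this.afterSave = afterSave; }

        void save(String entity) {
            store.add(entity);        // write store (MongoDB in the project)
            afterSave.accept(entity); // hook fires after a successful save
        }
    }

    public static void main(String[] args) {
        OutputChannel output = new OutputChannel();
        // The overridden hook turns each save into a published event.
        Repository repo = new Repository(e -> output.send("SAVED:" + e));
        repo.save("order-42");
        System.out.println(output.published); // [SAVED:order-42]
    }
}
```

The point of hooking afterSave rather than publishing from business code is that every persisted change, regardless of which code path wrote it, produces an event.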
The consumer handles the event from the producer and just logs it, but it could eventually save it to Elasticsearch for full-text querying.
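The consumer side is correspondingly small; a sketch (again with illustrative names, not the project's actual classes):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.function.Consumer;

public class ConsumerSketch {
    static final List<String> log = new ArrayList<>();

    // A consumer that just records each received event; replacing the body
    // with an index call would turn it into the Elasticsearch read-store writer.
    static Consumer<String> loggingConsumer() {
        return event -> log.add("received: " + event);
    }

    public static void main(String[] args) {
        Consumer<String> consumer = loggingConsumer();
        consumer.accept("SAVED:order-42");
        System.out.println(log); // [received: SAVED:order-42]
    }
}
```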
You can find the code here, and set up the small infrastructure through Docker Compose.