I wonder if Kafka represents an existential angst in these Kubernetized microservice times. Or is it more simply that I am just too dumb to learn and use this shit correctly?
Related HN discussion of this [1]
[1] https://github.com/edenhill/librdkafka
[2] https://github.com/Shopify/sarama
>the attempt to charge is recorded in a ledger
Hint: how do you think that attempt is recorded and fulfilled? Or do you think it's "just appended" and the bank recalculates your balance from scratch every time you spend $1 on a can of Coke?
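To make the point concrete, here's a minimal sketch (not any real bank's schema, all names invented): an append-only ledger paired with a running balance that is updated as each entry is appended, so a charge never requires replaying the whole history.

```python
# Illustrative only: append-only entries plus an incrementally
# maintained balance, rather than a recompute-from-scratch on
# every transaction.
class Ledger:
    def __init__(self, opening_balance=0):
        self.entries = []               # append-only record of attempts
        self.balance = opening_balance  # maintained incrementally

    def charge(self, amount, description):
        if amount > self.balance:
            # The failed attempt is still recorded in the ledger.
            self.entries.append(("declined", amount, description))
            return False
        self.entries.append(("charged", -amount, description))
        self.balance -= amount  # updated alongside the append
        return True

ledger = Ledger(opening_balance=5)
ledger.charge(1, "coke can")   # succeeds
ledger.charge(100, "tv")       # declined, but still appended
```

Real ledgers add snapshots, idempotency keys, and transactional guarantees on top, but the shape is the same: the append and the balance update happen together.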
The only bank I've heard of that doesn't use a traditional relational database for its ledger is Monzo [1] - and even they rely on Cassandra's transactions.
[1] https://www.scaleyourapp.com/an-insight-into-the-backend-inf...
* Just keep your architecture a monolith. You'll be fine in the majority of cases.
* Event sourcing doesn't require Kafka clusters, and neither do event-driven setups. You don't need complex tooling to pass around strings or JSON blobs. An S3 bucket or a PostgreSQL database storing events as JSON is often fine.
* Postgres can already do most of what you need in practice (except the "webscale" clustering etc.)[0].
* Redis[1]
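The "events as JSON in a plain table" idea from the list above can be sketched in a few lines. SQLite is used here only so the example is self-contained; a Postgres table with a `jsonb` column works the same way, and the table and event names are made up for illustration.

```python
import json
import sqlite3

# An ordinary SQL table as an event store: an append-only log of
# JSON-serialized events, keyed by stream and ordered by insert id.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE events (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    stream TEXT NOT NULL,
    payload TEXT NOT NULL  -- the event, serialized as JSON
)""")

def append_event(stream, event):
    db.execute("INSERT INTO events (stream, payload) VALUES (?, ?)",
               (stream, json.dumps(event)))

def replay(stream):
    # Rebuild a stream's history by reading its events in order.
    rows = db.execute(
        "SELECT payload FROM events WHERE stream = ? ORDER BY id",
        (stream,))
    return [json.loads(p) for (p,) in rows]

append_event("cart-42", {"type": "item_added", "sku": "abc"})
append_event("cart-42", {"type": "checked_out"})
events = replay("cart-42")
```

No broker, no partitions, no consumer groups; for many workloads an ordered table and a `WHERE stream = ?` query is all the event infrastructure you need.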
My main point is that while Kafka is a fantastic tool, you don't need that tool to achieve what you want in many cases.
> It seems as good a way as any to decouple systems
IMO relying on a tool, rather than on design patterns, to achieve good software design is a recipe for trouble. If anything because it locks you in (do you suddenly get a tightly coupled system if you remove Kafka?), or because the tool's details force you into directions that don't naturally fit your domain or problem.
--
[0] https://spin.atomicobject.com/2021/02/04/redis-postgresql/
[1] https://redis.com/redis-best-practices/communication-pattern... etc.