zlacker

[parent] [thread] 3 comments
1. olavgg+(OP)[view] [source] 2021-11-26 10:52:39
I work in the oil and gas industry, where legacy systems run on their last breath. Kafka is a fantastic tool and solves a shit ton of problems. We have millions of sensors on an offshore installation; these all send data into Kafka, where we generate events on new topics from different timeseries. Other data services consume these topics and get data updated in near realtime.

No more daily SQL dumps from offshore to onshore and big batch procedures to generate outdated events.
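In case it helps picture the flow, here's a toy sketch of the "raw readings in, derived events out" step. The threshold rule, topic names, and event shape are all made up for illustration; the real version would consume from a raw-sensor Kafka topic and produce to a derived-events topic instead of working on in-memory lists.

```python
# Illustrative stand-in for a consume-transform-produce loop:
# read from a hypothetical 'sensors.raw' topic, detect threshold
# crossings per sensor, write events to 'sensors.events'.

from dataclasses import dataclass

@dataclass
class Reading:
    sensor_id: str   # e.g. a pressure sensor on the installation
    ts: float        # epoch seconds
    value: float

def derive_events(readings, threshold=100.0):
    """Emit one event per upward threshold crossing, per sensor."""
    above = {}   # sensor_id -> was the previous reading above threshold?
    events = []
    for r in readings:
        was_above = above.get(r.sensor_id, False)
        is_above = r.value > threshold
        if is_above and not was_above:
            events.append({"sensor_id": r.sensor_id, "ts": r.ts,
                           "type": "threshold_exceeded", "value": r.value})
        above[r.sensor_id] = is_above
    return events

if __name__ == "__main__":
    readings = [Reading("p-17", 1.0, 95.0),
                Reading("p-17", 2.0, 104.0),   # crosses up -> event
                Reading("p-17", 3.0, 110.0),   # still above -> no new event
                Reading("p-42", 3.5, 101.0)]   # crosses up -> event
    print(derive_events(readings))
```

The point of the derived topic is that downstream consumers only see meaningful events, not every raw reading, which is what replaces the old batch procedures.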

replies(3): >>elcano+wd >>berkes+Zf >>fatbir+SM
2. elcano+wd[view] [source] 2021-11-26 13:10:25
>>olavgg+(OP)
What legacy systems is the oil and gas using? MQTT? OPC-DA? OPC-UA?
3. berkes+Zf[view] [source] 2021-11-26 13:39:03
>>olavgg+(OP)
Sounds like you have Serious Problems, for which Kafka is a very good solution.

For me, Kafka sits in the same area of solutions as Kubernetes, Hadoop clusters, or anything "webscale": you don't need it. Until you do, but by then you'll (i) have Serious Problems that such systems solve and (ii) the manpower and budgets to fix them.

By which I don't mean you should avoid Kafka at all costs. By all means, play around with it: if anything, the event-driven approach will teach you things that make you a better Rails/Flask/WordPress developer, if that is what you do.

4. fatbir+SM[view] [source] 2021-11-26 17:27:47
>>olavgg+(OP)
I'm in the same situation in the paper making industry. Kafka is an almost perfect match for our needs: high volume, durable storage, decoupled stream processing.