
Kafka is a streaming log service. People who use it just for passing messages are hugely overengineering something that RabbitMQ, Redis, or similar tools would handle just fine.

An RSS feed must be powered by something underneath, and any of those tools can do the job. In most situations it would be extremely impractical to use a relational database for this kind of thing. If you are receiving thousands of messages per second, which is not uncommon, no transactional database will give you enough write throughput, nor will it handle the query load from clients polling for updates, as an RSS feed would require. Note that caching queries is almost useless here, because the latest content changes every few milliseconds.
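Part of why polling a log scales better than polling a database is that each consumer just remembers an offset, so "what's new?" is a slice off the tail rather than a query that has to be planned, executed, and cache-invalidated. A minimal sketch of that idea (assumptions: an in-memory list standing in for one partition's log; real Kafka consumers track offsets the same way, just against files over the network):

```python
# Toy stand-in for a single partition log. Each consumer keeps its own
# offset; fetching new records is a slice from that offset to the end.
log = []

def append(msg):
    """Append a record and return its offset."""
    log.append(msg)
    return len(log) - 1

def poll(offset):
    """Return all records at or after `offset`, plus the next offset to use."""
    new = log[offset:]
    return new, len(log)

append("a"); append("b")
msgs, next_off = poll(0)   # -> (["a", "b"], 2)
append("c")
msgs2, _ = poll(next_off)  # -> (["c"], 3)
```

No caching is needed because the consumer never re-reads what it already saw; it only ever asks for the suffix past its own offset.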

Kafka, like pretty much any other queueing datastore, is optimized for append-only writes with no deletes or updates. Reading from the end of the log is extremely fast, and sequential reads down the stream are quite fast too. Random reads, such as those commonly handled by SQL databases, are either unavailable or less efficient than in SQL.
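The append-only layout is what makes both ends of that trade-off concrete: writes always go to the end of a file, and reads are a sequential scan from a byte offset. A hedged sketch of the idea (assumptions: a single "segment" file with a 4-byte length prefix per record, a toy stand-in for Kafka's actual on-disk format):

```python
import os
import tempfile

# One segment file; appends never seek, reads scan forward from an offset.
path = os.path.join(tempfile.mkdtemp(), "segment.log")

def append_record(payload: bytes) -> None:
    """Append a length-prefixed record; the OS turns this into a sequential write."""
    with open(path, "ab") as f:
        f.write(len(payload).to_bytes(4, "big") + payload)

def read_from(byte_offset: int) -> list:
    """Sequential scan from `byte_offset` to the end of the segment."""
    records = []
    with open(path, "rb") as f:
        f.seek(byte_offset)
        while header := f.read(4):
            records.append(f.read(int.from_bytes(header, "big")))
    return records

append_record(b"tx-1")
append_record(b"tx-2")
print(read_from(0))  # [b'tx-1', b'tx-2']
```

Finding record N by key, on the other hand, would mean scanning from the start, which is exactly the random-access pattern this layout gives up.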

That said, Kafka can be used to pass any kind of message between applications: from simple text messages and small JSON payloads to video frames, protobuf messages, and other chunks of serialized data.
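This works because Kafka payloads are just byte arrays; the producer and consumer only have to agree on the encoding. A minimal sketch of two such encoders (assumption: plain UTF-8 and compact JSON as the agreed formats; a real deployment might use protobuf or Avro instead, but the principle is the same):

```python
import json

def encode_text(s: str) -> bytes:
    """Plain UTF-8 text payload."""
    return s.encode("utf-8")

def encode_json(obj) -> bytes:
    """Compact JSON payload; consumers decode with json.loads."""
    return json.dumps(obj, separators=(",", ":")).encode("utf-8")

print(encode_text("hello"))                  # b'hello'
print(encode_json({"user": 1, "ok": True}))  # b'{"user":1,"ok":true}'
```

Either result can be handed to any Kafka producer client as the message value; the broker never inspects the bytes.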

It is also a very durable data store for immutable, time-ordered data, and is widely used in the financial world to store transaction logs.


