  1. Apache Kafka

    Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and …

  2. Apache Kafka

    The Connector API allows building and running reusable producers or consumers that connect Kafka topics to existing applications or data systems. For example, a connector to a relational …
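    (A minimal source-task sketch in Java appears after this results list.)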

  3. Apache Kafka

    Kafka Streams has no external dependencies on systems other than Apache Kafka itself as the internal messaging layer; notably, it uses Kafka's partitioning model to horizontally scale processing …
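    (A short scaling-configuration sketch appears after this results list.)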

  4. Project information - Apache Kafka

    Apache Kafka, Kafka, and the Kafka logo are either registered trademarks or trademarks of The Apache Software Foundation in the United States and other countries.

  5. Powered By - Apache Kafka

    Apache Kafka is used for both real-time and batch data processing, and is the chosen event log technology for Amadeus microservice-based streaming applications. Kafka is also used for …

  6. Apache Kafka

    Apache Kafka 4.0 is a significant milestone, marking the first major release to operate entirely without Apache ZooKeeper®. By running in KRaft mode by default, Kafka simplifies …

  7. Documentation - Apache Kafka

    Kafka Streams is a Kafka client library for real-time stream processing and analysis of data stored in Kafka brokers. This quickstart example will demonstrate how to run a streaming …
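    (A word-count topology sketch in Java appears after this results list.)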

  8. Apache Kafka

    We use Kafka, Kafka Connect, and Kafka Streams to enable our developers to access data freely in the company. Kafka Streams powers parts of our analytics pipeline and delivers endless …

  9. Apache Kafka

    As the first step, we will start Kafka (unless it is already running) and then write input data to a Kafka topic, which will subsequently be processed by a Kafka Streams application.
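    (A producer sketch for writing such input data appears below.)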
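
Result 2 describes the Connector API (Kafka Connect). The following is a minimal sketch of a source task in Java, not a full plugin: the class name GreetingSourceTask, the "topic" config key, and the hard-coded record are invented for illustration, and a deployable connector would also need a matching SourceConnector class plus a real external system (such as the relational database the snippet alludes to) to read from.

    import java.util.Collections;
    import java.util.List;
    import java.util.Map;

    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.source.SourceRecord;
    import org.apache.kafka.connect.source.SourceTask;

    // Hypothetical task that emits one hard-coded record per poll; a real task
    // would pull rows or events from an external system.
    public class GreetingSourceTask extends SourceTask {

        private String topic;

        @Override
        public String version() {
            return "0.1.0";
        }

        @Override
        public void start(Map<String, String> props) {
            // "topic" is a made-up config key for this sketch.
            topic = props.get("topic");
        }

        @Override
        public List<SourceRecord> poll() throws InterruptedException {
            Thread.sleep(1000); // avoid a tight loop in this toy example
            return Collections.singletonList(new SourceRecord(
                    Collections.singletonMap("source", "demo"), // source partition
                    Collections.singletonMap("offset", 0L),     // source offset
                    topic,
                    Schema.STRING_SCHEMA,
                    "hello from a source connector"));
        }

        @Override
        public void stop() {
            // nothing to clean up in this sketch
        }
    }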
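
Result 3 notes that Kafka Streams depends on nothing but Kafka itself and scales out through Kafka's partitioning model. The sketch below only shows the configuration side of that claim; the broker address and application id are assumptions. Starting more instances with the same application.id, or raising num.stream.threads, divides the input-topic partitions among the running tasks.

    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.StreamsConfig;

    public class ScalingConfigSketch {
        public static Properties streamsProps() {
            Properties props = new Properties();
            // The only runtime dependency is the Kafka cluster itself.
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            // Instances that share this application.id form one group and
            // split the partitions of the input topics between them.
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "scaling-demo-app");
            // Threads inside one instance also each take a share of the partitions.
            props.put(StreamsConfig.NUM_STREAM_THREADS_CONFIG, 2);
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            return props;
        }
    }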
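
Result 7 points at the Kafka Streams quickstart. A word-count topology in that spirit might look like the following; the topic names streams-plaintext-input and streams-wordcount-output follow the quickstart, while the broker address and application id are assumptions.

    import java.util.Arrays;
    import java.util.Properties;

    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;
    import org.apache.kafka.streams.kstream.KStream;
    import org.apache.kafka.streams.kstream.KTable;
    import org.apache.kafka.streams.kstream.Produced;

    public class WordCountSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-sketch");
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

            StreamsBuilder builder = new StreamsBuilder();
            KStream<String, String> lines = builder.stream("streams-plaintext-input");
            KTable<String, Long> counts = lines
                    .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\W+")))
                    .groupBy((key, word) -> word)
                    .count();
            // Word counts are written out as <word, count> records.
            counts.toStream().to("streams-wordcount-output",
                    Produced.with(Serdes.String(), Serdes.Long()));

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();
            Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
        }
    }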
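
Result 9 describes the first quickstart step: start Kafka, then get input data into a Kafka topic for a Kafka Streams application to process. A plain Java producer can handle the second part; the broker address and the sample lines below are assumptions for illustration.

    import java.util.Properties;

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.Producer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class InputDataLoader {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumes a local broker
            props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
            props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

            String[] lines = {
                    "all streams lead to kafka",
                    "hello kafka streams",
                    "join kafka summit"
            };

            try (Producer<String, String> producer = new KafkaProducer<>(props)) {
                for (String line : lines) {
                    // Keys are left null; each record value is one line of text.
                    producer.send(new ProducerRecord<>("streams-plaintext-input", line));
                }
            }
        }
    }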