Apache Kafka is a distributed event streaming platform used for high-performance data pipelines, streaming analytics, and data integration. Apache Kafka Connect is a tool that scalably and reliably streams data between Apache Kafka® and other data systems. Kafka Connect is an ecosystem of pre-written and maintained Kafka producers (source connectors) and Kafka consumers (sink connectors) for data products and platforms such as databases and message brokers. This guide explains how to set up Kafka and Kafka Connect to stream data from a Kafka topic into your .

Prerequisites

To follow the steps on this page:
  • Create a target with time-series and analytics enabled.

    You need your connection details. This procedure also works for .

Install and configure Apache Kafka

To install and configure Apache Kafka: Keep these terminals open; you use them to test the integration later.
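As a minimal sketch of this step, the following installs Kafka locally and starts a broker in KRaft mode. The Kafka version, download URL, and directory names are assumptions; substitute the release you actually download.

```shell
# Download and unpack Kafka (version is an assumption -- use a current release)
wget https://downloads.apache.org/kafka/3.7.0/kafka_2.13-3.7.0.tgz
tar -xzf kafka_2.13-3.7.0.tgz
cd kafka_2.13-3.7.0

# Format storage and start the broker in KRaft mode (no ZooKeeper needed)
KAFKA_CLUSTER_ID="$(bin/kafka-storage.sh random-uuid)"
bin/kafka-storage.sh format -t "$KAFKA_CLUSTER_ID" -c config/kraft/server.properties
bin/kafka-server-start.sh config/kraft/server.properties
```

With the broker running, you can create the topic used later in this guide from a second terminal:

```shell
# "accounts" is the topic the rest of this guide assumes
bin/kafka-topics.sh --create --topic accounts --bootstrap-server localhost:9092
```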

Install the sink connector to communicate with

To set up Kafka Connect server, plugins, drivers, and connectors:
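A rough sketch of the Kafka Connect setup, run from the Kafka installation directory. The plugin directory and connector archive name are assumptions; use the sink connector distribution you downloaded.

```shell
# Create a directory for Kafka Connect plugins (path is an assumption)
mkdir -p /opt/kafka/connect-plugins

# Unpack the sink connector you downloaded into that directory, e.g.:
# unzip your-sink-connector.zip -d /opt/kafka/connect-plugins

# Tell Kafka Connect where to find plugins
echo "plugin.path=/opt/kafka/connect-plugins" >> config/connect-distributed.properties

# Start Kafka Connect in distributed mode; its REST API listens on port 8083
bin/connect-distributed.sh config/connect-distributed.properties
```

Keep this terminal open as well; the Connect REST API on port 8083 is used to create the sink in a later step.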

Create a table in your to ingest Kafka events

To prepare your for Kafka integration:
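As an illustration, the following creates a hypothetical `accounts` table shaped like the messages sent in the test step. The connection string, table name, and columns are assumptions; match them to your own connection details and event schema.

```shell
# $CONNECTION_STRING holds your connection details (an assumption)
psql "$CONNECTION_STRING" <<'SQL'
CREATE TABLE accounts (
    created_at TIMESTAMPTZ DEFAULT NOW(),
    name TEXT,
    city TEXT
);
SQL
```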

Create the sink

To create a sink in Apache Kafka:
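A sink is registered by POSTing its configuration to the Kafka Connect REST API. This is a sketch only: the connector class shown here is the Confluent JDBC sink, and the connection URL, credentials, and property names are assumptions; use the values documented for the sink connector you installed.

```shell
# Register a sink named "accounts-sink" that reads the "accounts" topic
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d '{
    "name": "accounts-sink",
    "config": {
      "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
      "topics": "accounts",
      "connection.url": "jdbc:postgresql://localhost:5432/tsdb",
      "connection.user": "postgres",
      "connection.password": "password",
      "insert.mode": "insert",
      "auto.create": "false"
    }
  }'
```

You can confirm the sink was created by listing connectors with `curl http://localhost:8083/connectors`.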

Test the integration with

To test this integration, send some messages to the accounts topic using the kafkacat (kcat) utility. Once the messages appear in your target table, you have successfully integrated Apache Kafka with .
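For example, producing two JSON messages with kcat might look like this. The broker address and the message fields are assumptions chosen to match the example table above.

```shell
# -b = broker, -t = topic, -P = producer mode
echo '{"name": "Lola", "city": "Copacabana"}' | kcat -b localhost:9092 -t accounts -P
echo '{"name": "Holly", "city": "Miami"}' | kcat -b localhost:9092 -t accounts -P
```

After a few seconds, querying the table (for example, `psql "$CONNECTION_STRING" -c "SELECT * FROM accounts;"`) should show the rows written by the sink.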