Streaming Kafka topic data directly to Elasticsearch with the Elasticsearch Service Sink Connector

Published: 2025-06-20

Link: Elasticsearch Service Sink Connector for Confluent Platform | Confluent Documentation

Link: Apache Kafka

1. Setting up the test environment

Download the Elasticsearch Service Sink Connector:

https://file.zjwlyy.cn/confluentinc-kafka-connect-elasticsearch-15.0.0.zip

For convenience, run Kafka and Elasticsearch with Docker.

docker run -d --name elasticsearch \
  -e "discovery.type=single-node" \
  -e ES_JAVA_OPTS="-Xms512m -Xmx512m" \
  -p 9200:9200 -p 9300:9300 \
  docker.elastic.co/elasticsearch/elasticsearch:7.17.1

docker run --user root -d --name kafka -p 9092:9092 -p 8083:8083 apache/kafka:3.9.1
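Before going further, it helps to confirm both containers actually expose their mapped ports. A minimal standard-library sketch (host and ports as mapped in the docker run commands above):

```python
import socket

def port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# After the containers start (ports as mapped above):
#   port_open("127.0.0.1", 9200)  # Elasticsearch
#   port_open("127.0.0.1", 9092)  # Kafka
```

This only checks TCP reachability, not service health; the curl checks later in this article verify the services themselves.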

Copy confluentinc-kafka-connect-elasticsearch-15.0.0.zip into the kafka container. Create the target directory first, since docker cp does not create it:

docker exec kafka mkdir -p /opt/connectors
docker cp confluentinc-kafka-connect-elasticsearch-15.0.0.zip kafka:/opt/connectors

Enter the kafka container:

docker exec -it kafka /bin/bash

Edit the Connect worker configuration:

vi /opt/kafka/config/connect-standalone.properties

# Set to the directory where the zip is unpacked
plugin.path=/opt/connectors

Unzip the archive:

cd /opt/connectors
unzip confluentinc-kafka-connect-elasticsearch-15.0.0.zip

Edit the connector configuration file:

vi /opt/connectors/confluentinc-kafka-connect-elasticsearch-15.0.0/etc/quickstart-elasticsearch.properties
# Basic settings
name=t-elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
# Adjust to the topic's partition count
tasks.max=3
topics=t-elasticsearch-sink
key.ignore=true
key.converter=org.apache.kafka.connect.storage.StringConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

# ES connection settings
# Multiple comma-separated URLs are load-balanced
# (.properties files do not support inline comments, so keep comments on their own lines)
connection.url=http://192.168.1.1:9200
type.name=_doc
index.name=t-elasticsearch-sink
# Create the index automatically (or pre-create it manually)
#index.auto.create=true
schema.ignore=true

# Fault tolerance and error handling
errors.tolerance=all
# Dead letter queue topic for failed records (recommended with errors.tolerance=all)
#errors.deadletterqueue.topic.name=dlq_t4_elasticsearch
# Keep error context in record headers
#errors.deadletterqueue.context.headers.enable=true
# Skip messages with null values
behavior.on.null.values=IGNORE

# Performance tuning
# Batch size for bulk writes (higher throughput)
batch.size=2000
# Maximum concurrent requests
max.in.flight.requests=5
# Retries on failure
max.retries=10
# Backoff between retries (ms)
retry.backoff.ms=5000
# Read timeout (ms)
read.timeout.ms=10000
# Connection timeout (ms)
connection.timeout.ms=10000
# Flush timeout (ms)
flush.timeout.ms=30000
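In distributed mode the same settings are submitted as JSON to the Connect REST API instead of a properties file. A small sketch (the worker address and connector name are the ones assumed in this article) that converts properties text into the REST payload:

```python
import json

def props_to_rest_payload(props_text: str) -> str:
    """Convert Java-style properties text into the JSON body expected by
    POST http://localhost:8083/connectors (Kafka Connect REST API)."""
    config = {}
    for line in props_text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and full-line comments
        key, _, value = line.partition("=")
        config[key.strip()] = value.strip()
    # The REST API expects the name at the top level, not inside "config"
    name = config.pop("name")
    return json.dumps({"name": name, "config": config}, indent=2)

example = """\
name=t-elasticsearch-sink
connector.class=io.confluent.connect.elasticsearch.ElasticsearchSinkConnector
topics=t-elasticsearch-sink
connection.url=http://192.168.1.1:9200
"""
payload = props_to_rest_payload(example)
```

The resulting payload can then be submitted with curl -X POST -H "Content-Type: application/json" --data @payload.json http://localhost:8083/connectors.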

Start the connector:

cd /opt/kafka/bin

./connect-standalone.sh -daemon ../config/connect-standalone.properties /opt/connectors/confluentinc-kafka-connect-elasticsearch-15.0.0/etc/quickstart-elasticsearch.properties

2. Checking connector status

curl -XGET http://localhost:8083/connectors/t-elasticsearch-sink/status   # check status

curl -XGET http://localhost:8083/connectors/t-elasticsearch-sink/config   # view configuration

curl -X DELETE http://localhost:8083/connectors/t-elasticsearch-sink/offsets   # reset offsets (the connector must be stopped first)

curl -X DELETE http://localhost:8083/connectors/t-elasticsearch-sink   # delete this connector
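The /status response can also be checked programmatically. A sketch (endpoint and connector name as above) that decides from the returned JSON whether the connector and all of its tasks are RUNNING:

```python
import json
from urllib.request import urlopen

# Assumes the standalone worker from this article is on localhost:8083
STATUS_URL = "http://localhost:8083/connectors/t-elasticsearch-sink/status"

def all_running(status: dict) -> bool:
    """True when the connector and every task report state RUNNING."""
    states = [status.get("connector", {}).get("state")]
    states += [t.get("state") for t in status.get("tasks", [])]
    return all(s == "RUNNING" for s in states)

def check_live() -> bool:
    """Fetch the status from the worker's REST API (requires it to be up)."""
    with urlopen(STATUS_URL) as resp:
        return all_running(json.load(resp))
```

A FAILED task also carries a "trace" field with the stack trace, which is the first place to look when all_running returns False.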

3. Testing writes

./kafka-topics.sh --bootstrap-server 127.0.0.1:9092 --list   # list topics

./kafka-topics.sh --bootstrap-server 127.0.0.1:9092 --delete --topic t-elasticsearch-sink   # delete the topic

./kafka-console-producer.sh --bootstrap-server 127.0.0.1:9092 --topic t-elasticsearch-sink   # type one message per line
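Because the sink uses JsonConverter with value.converter.schemas.enable=false, each line fed to the console producer must be a standalone JSON document. A sketch (the field names are illustrative, not required by the connector) that serializes records into lines suitable for pasting into kafka-console-producer:

```python
import json

def to_producer_lines(records: list[dict]) -> str:
    """Serialize records as one compact JSON document per line --
    the message-per-line format kafka-console-producer sends."""
    return "\n".join(json.dumps(r, ensure_ascii=False) for r in records)

# Illustrative records; the connector does not prescribe any schema here
lines = to_producer_lines([
    {"id": 1, "msg": "hello"},
    {"id": 2, "msg": "world"},
])
```

Lines that are not valid JSON fail conversion; with errors.tolerance=all they are skipped (or routed to the DLQ if one is configured) instead of failing the task.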

4. Checking the ES index

curl http://127.0.0.1:9200/_cat/indices?v
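The _cat/indices?v output is a whitespace-aligned table. A sketch that parses it into dicts so the sink's target index (t-elasticsearch-sink, as configured above) can be checked for its document count:

```python
def parse_cat_indices(text: str) -> list[dict]:
    """Parse Elasticsearch _cat/indices?v output (header row followed by
    data rows, whitespace-separated columns) into a list of dicts."""
    lines = [ln for ln in text.splitlines() if ln.strip()]
    if not lines:
        return []
    header = lines[0].split()
    return [dict(zip(header, row.split())) for row in lines[1:]]

# Abbreviated sample of real _cat/indices?v output (uuid shortened)
sample = """\
health status index                 uuid pri rep docs.count
yellow open   t-elasticsearch-sink  abc1 1   1   3
"""
indices = parse_cat_indices(sample)
```

After the console-producer test above, docs.count for t-elasticsearch-sink should match the number of valid JSON lines sent (bulk writes flush in batches, so allow a few seconds).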

