
Integrating Flume and Kafka: Using Flume to Pull Log Data into Kafka

2024/11/29 · Source: https://blog.csdn.net/lzhlizihang/article/details/144035525

Contents

    • 1. Kafka as Source [data already sitting in Kafka is pulled out]
    • 2. Kafka as Sink [data from elsewhere is pulled into Kafka]


1. Kafka as Source [data already sitting in Kafka is pulled out]

Kafka source --> memory channel --> console (logger sink):

a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1

# Describe/configure the Kafka source
a1.sources.r1.type = org.apache.flume.source.kafka.KafkaSource
a1.sources.r1.batchSize = 100
a1.sources.r1.batchDurationMillis = 2000
a1.sources.r1.kafka.bootstrap.servers = node01:9092,node02:9092,node03:9092
a1.sources.r1.kafka.topics = five
a1.sources.r1.kafka.consumer.group.id = donghu

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Logger sink prints the consumed events to the console
a1.sinks.k1.type = logger
a1.sinks.k1.maxBytesToLog = 128
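A minimal run sketch follows, assuming the configuration above is saved as kafka-to-logger.conf, that flume-ng and the Kafka command-line scripts are on the PATH, and that the topic five already exists on the node01/node02/node03 brokers; the file name and the console session are illustrative and not from the original post.

# Start the Flume agent named a1 and log events to the console
flume-ng agent --name a1 --conf ./conf --conf-file kafka-to-logger.conf -Dflume.root.logger=INFO,console

# In another terminal, write a few test messages into the topic "five";
# the logger sink should print them on the Flume console
# (older Kafka versions use --broker-list instead of --bootstrap-server)
kafka-console-producer.sh --bootstrap-server node01:9092 --topic five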

2. Kafka as Sink [data from elsewhere is pulled into Kafka]

netcat source --> memory channel --> Kafka sink:

## a1 is the name of the Flume agent
## source r1
## channel c1
## sink k1
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the netcat source
a1.sources.r1.type = netcat
a1.sources.r1.bind = bigdata01
a1.sources.r1.port = 44444

# Change the sink to Kafka
a1.sinks.k1.type = org.apache.flume.sink.kafka.KafkaSink
a1.sinks.k1.kafka.bootstrap.servers = node01:9092
a1.sinks.k1.kafka.topic = five
a1.sinks.k1.kafka.producer.acks = 1
a1.sinks.k1.kafka.producer.linger.ms = 1

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
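A similar test sketch for this netcat-to-Kafka agent, assuming the configuration is saved as netcat-to-kafka.conf and the topic five already exists; the file name and the terminal session are illustrative assumptions, not part of the original post.

# Start the Flume agent named a1
flume-ng agent --name a1 --conf ./conf --conf-file netcat-to-kafka.conf -Dflume.root.logger=INFO,console

# In another terminal, send a few lines to the netcat source
nc bigdata01 44444

# Consume from the topic to confirm the events arrived in Kafka
kafka-console-consumer.sh --bootstrap-server node01:9092 --topic five --from-beginning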
