a1.sinks.k1.type = hdfs
# %Y, %m, %d in the path are expanded from the event timestamp
a1.sinks.k1.hdfs.path = hdfs://localhost:8020/flume/event/hdfs/date=%Y-%m_%d
# roll the file once it reaches 10 KB; disable time- and count-based rolling
a1.sinks.k1.hdfs.rollSize = 10240
a1.sinks.k1.hdfs.rollInterval = 0
a1.sinks.k1.hdfs.rollCount = 0
# use the agent's local time for the path escapes instead of a timestamp header
a1.sinks.k1.hdfs.useLocalTimeStamp = true

# bind source and sink to t...
Flume is also a log collector, similar to Logstash in the ELK stack. Data collected by Flume can be written to HDFS, or it can be sent to a message queue such as Kafka, and Kafka in turn feeds mainstream stream-processing engines.
Component 1: r1, the source; netcat is used for testing
Component 2: k1, the sink
Component 3: c1, the channel; memory is used as the buffer for testing
Flume ships with a Kafka integration scheme; in Flume, add...
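The three components above can be wired into a minimal agent definition. The sketch below is illustrative only, not from the original: the agent name `a1`, the listening port `44444`, and the use of a `logger` sink for testing are assumptions.

```properties
# Name the components on this agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# r1: netcat source listening on a local port (assumed port 44444)
a1.sources.r1.type = netcat
a1.sources.r1.bind = localhost
a1.sources.r1.port = 44444

# k1: logger sink, prints events to the console for testing
a1.sinks.k1.type = logger

# c1: in-memory channel buffering events between source and sink
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# bind source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

With the agent running, `nc localhost 44444` can be used to type test events and watch them appear on the sink side.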
[root@hadoop102 job]$ vim flume-file-hdfs.conf
Add the following content:
# Name the components on this agent
a2.sources = r2
a2.sinks = k2
a2.channels = c2

# Describe/configure the source
a2.sources.r2.type = exec
a2.sources.r2.command = tail -F /opt/module/hive/logs/hive.log
a2.sources.r2....
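Once the file is saved, the agent can be started with the `flume-ng` launcher. A hedged sketch, assuming Flume's home directory is the working directory and the config file sits under a `job/` subdirectory:

```
bin/flume-ng agent \
  --conf conf \
  --name a2 \
  --conf-file job/flume-file-hdfs.conf
```

`--name` must match the agent prefix used in the file (`a2` here), or none of the properties will be picked up.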