Is it possible to have Logstash read data from Elasticsearch in real time and output it to Kafka, instead of doing a full read every time?
input {
  elasticsearch {
    index => "filebeat*"
    hosts => "http://10.10.10.160:9200"
    # query => '{ "query": { "match": { "statuscode": 200 } }, "sort": [ "_doc" ] }'
    #user => "elastic"
    #password => "changeme"
    size => 500
    scroll => "5m"
    # docinfo => true
    # docinfo_target => "[@metadata][doc]"
  }
}
#filter {}
output {
  stdout {}
  kafka {
    bootstrap_servers => "127.0.0.1:9092"
    topic_id => "test-a-log"
  }
}
This configuration pushes all of the ES data into Kafka on every run. Is it possible to output data to Kafka only when new data arrives?
1 reply
locatelli
Your current configuration is a read-once setup. Read the documentation more carefully.
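
For reference, one common way to approximate incremental reads is to let the elasticsearch input poll on a schedule and restrict the query to recently written documents. The sketch below is only an illustration, not a verified solution: the schedule option (cron syntax) is part of the elasticsearch input plugin, but the @timestamp field name and the one-minute window are assumptions that must match your index and polling interval.

input {
  elasticsearch {
    hosts => "http://10.10.10.160:9200"
    index => "filebeat*"
    # Run the query once per minute instead of a single read at startup.
    schedule => "* * * * *"
    # Fetch only documents from the last minute; @timestamp and "now-1m" are
    # assumptions and must line up with your data and the schedule above.
    query => '{ "query": { "range": { "@timestamp": { "gte": "now-1m" } } }, "sort": [ "_doc" ] }'
    size => 500
    scroll => "5m"
  }
}
output {
  kafka {
    bootstrap_servers => "127.0.0.1:9092"
    topic_id => "test-a-log"
  }
}

Note that a time-window poll like this can re-read or miss documents around the window boundary if indexing lags. If the data already comes from Filebeat, sending Filebeat's output to Kafka directly avoids the round trip through Elasticsearch entirely.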