Filebeat and the ELK stack are all on 5.0; Kafka is 0.10.1.0.
A message produced by Filebeat with output.kafka looks like this once it reaches Kafka:
{"@timestamp":"2016-11-28T06:13:29.168Z","beat":{"hostname":"localhost.localdomain","name":"192.168.138.3","version":"5.0.1"},"input_type":"log","message":"Nov 28 09:09:47 localhost kernel: type=1305 audit(1480295387.539:3): audit_pid=1196 old=0 auid=4294967295 ses=4294967295 res=1","offset":1475347,"source":"/var/log/messages","type":"syslog"}
After Logstash consumes it with input-kafka, it looks like this:
{
    "@timestamp" => 2016-11-28T06:11:44.896Z,
         "geoip" => {},
      "@version" => "1",
       "message" => "{\"@timestamp\":\"2016-11-28T06:13:29.168Z\",\"beat\":{\"hostname\":\"localhost.localdomain\",\"name\":\"192.168.138.3\",\"version\":\"5.0.1\"},\"input_type\":\"log\",\"message\":\"Nov 28 09:09:47 localhost kernel: type=1305 audit(1480295387.539:3): audit_pid=1196 old=0 auid=4294967295 ses=4294967295 res=1\",\"offset\":1475347,\"source\":\"/var/log/messages\",\"type\":\"syslog\"}",
          "tags" => [
        [0] "_grokparsefailure",
        [1] "_geoip_lookup_failure"
    ]
}
Everything ends up inside message, and none of the other beat fields are present.
When beats events travel through Kafka to Logstash, how should they be parsed? I'm only using input-kafka here; can I nest an input-beats inside it?
There is no beats plugin among the filters.
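The usual fix is not a nested input, but a codec: Filebeat's Kafka output writes each event as a JSON document, so telling the kafka input to decode JSON turns the envelope back into top-level event fields. A minimal sketch, assuming the broker runs on localhost:9092 and the topic is named "filebeat" (both are placeholders; substitute your own):

```conf
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics            => ["filebeat"]
    # Decode each Kafka record as JSON so beat.hostname, source,
    # offset, type, etc. become real event fields instead of one
    # escaped string stuffed into "message".
    codec             => "json"
  }
}
```

With the json codec in place, the inner "message" field again holds just the raw syslog line, and grok filters can match against it as they would with a direct beats input.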
5 replies
dennishood
Upvoted by: medcl
jhin
jhin
forsaken627 - post-90s typist
wangohyes
[2016-12-08T23:03:50,678][ERROR][logstash.inputs.kafka ] Unknown setting 'topic_id' for kafka
[2016-12-08T23:03:50,678][ERROR][logstash.inputs.kafka ] Unknown setting 'reset_beginning' for kafka
I'm running Kafka 0.10.0.0 with Logstash 5.0.2, and these errors are thrown at startup. Has anyone else run into this?
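Those "Unknown setting" errors appear because the kafka input plugin shipped with Logstash 5.x targets the new Kafka 0.10 consumer API and dropped the old option names: `topic_id` became `topics` (now an array), and `reset_beginning` was removed in favor of `auto_offset_reset`. A sketch of the updated settings, assuming the old config used `topic_id => "mytopic"` and `reset_beginning => true` ("mytopic" and the broker address are placeholders):

```conf
input {
  kafka {
    bootstrap_servers => "localhost:9092"
    topics            => ["mytopic"]     # replaces topic_id; takes an array
    auto_offset_reset => "earliest"      # closest equivalent of reset_beginning
  }
}
```

Note also that `zk_connect` is gone in the 5.x plugin; the consumer talks to the brokers directly via `bootstrap_servers` rather than to ZooKeeper.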