Using ELK for log collection
file -> logstash -> es -> kibana
Below is the Logstash configuration file:
input {
    file {
        ignore_older => 600
        stat_interval => 2
        close_older => 300
        path => ["/usr/local/xxx/xx/logs/smartfox.log.*"]
        exclude => "*.zip"
        codec => plain { charset => "ISO-8859-1" }
    }
}
filter {
    # discard any event whose message does not contain the marker
    if [message] !~ "ANALYSISLOG=" {
        ruby {
            code => "event.cancel"
        }
    }
    mutate {
        # split on the marker: [0] is the plain-text head, [1] is the JSON payload
        split => ["message", "ANALYSISLOG="]
        add_field => { "analysis_json" => "%{[message][1]}" }
        add_field => { "message_head" => "%{[message][0]}" }
    }
    # parse the JSON payload into top-level fields
    json {
        source => "analysis_json"
    }
    mutate {
        # the head is pipe-delimited; the fifth field is the server id
        split => ["message_head", "|"]
        add_field => { "serverid" => "%{[message_head][4]}" }
    }
    # remove_field is an option of a filter plugin, not a plugin itself,
    # so it must live inside mutate (as written originally it would fail
    # config validation)
    mutate {
        remove_field => ["message", "@version", "host", "path", "message_head", "analysis_json"]
    }
}
output {
    #stdout { codec => dots }
    elasticsearch {
        hosts => "192.168.1.101:9002"
        index => "logstash-test-%{+YYYY.MM.dd}"
        flush_size => 200
        idle_flush_time => 2
    }
}
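As an aside, the ruby event.cancel step can also be written with the stock drop filter, which does the same thing (discard non-matching lines) and is one less moving part to rule out when hunting for lost events. A minimal sketch of just the filter head, everything else unchanged:

filter {
    # drop any event whose message lacks the ANALYSISLOG= marker
    if [message] !~ "ANALYSISLOG=" {
        drop { }
    }
    # ... rest of the filter as above
}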
When querying in Kibana, I found that some of the filtered log events are missing. Could anyone help me check whether there is a problem somewhere in this config?
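One way to narrow this down is to run the filter section in isolation against a known sample line and inspect the full event on stdout. A minimal sketch, assuming a test file named test.conf and a made-up sample line (paste the filter block from above between input and output; stdin, stdout, and the rubydebug codec are all stock plugins):

input { stdin { } }
# filter { ... } goes here, copied unchanged from the config above
output { stdout { codec => rubydebug } }

Run it as, e.g.:

echo 'xx|xx|xx|xx|srv01|ANALYSISLOG={"uid":1}' | bin/logstash -f test.conf

If the event prints correctly here but still goes missing in Kibana, the cause is more likely in the file input settings (ignore_older / close_older interacting with rotated files) or on the Elasticsearch side than in the filter itself.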
thanks