
Timestamp in JSON-format logs fails to parse

Logstash | Author: qvitt | Posted on 2017-11-03 | Views: 6577

I have JSON-format log lines such as {"name":"李四","sex":"女","年龄":20,"生日":"2016-01-02 23:12:13"}.
When I use
input {
  file {
    type => "json_test"
    path => "/logs/test/json.log"
    codec => json {
      charset => "UTF-8"
    }
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  grok {
    match => [ "生日" , "%{TIMESTAMP_ISO8601:[@metadata][timestamp]}" ]
  }
  date {
    match => [ "[@metadata][timestamp]" , "yyyy-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => "10.26.222.213:9200"
    index => "logstash-%{[type]}-%{+YYYY}"
  }
  stdout {
    codec => rubydebug
  }
}
the Chinese text and every field come out fine, but the timestamp cannot be parsed. The result is:
{
    "path" => "/logs/test/json.log",
    "@timestamp" => 2017-11-03T05:57:31.523Z,
    "sex" => "女",
    "name" => "李四",
    "生日" => "2016-01-02 23:12:13",
    "@version" => "1",
    "host" => "qvit",
    "年龄" => 20,
    "type" => "json_test",
    "tags" => [
        [0] "_grokparsefailure"
    ]
}
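Side question: since 生日 already arrives as a plain "yyyy-MM-dd HH:mm:ss" string, is the grok step even needed, or could the date filter read the field directly? A rough, untested sketch of what I mean (same field name and format as above):

filter {
  date {
    # hypothetical: parse the 生日 field straight into @timestamp, skipping grok
    match => [ "生日" , "yyyy-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}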
But when I use
input {
  file {
    type => "json_test"
    path => "/logs/test/json.log"
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}
filter {
  json {
    source => "message"
  }
  grok {
    match => [ "生日" , "%{TIMESTAMP_ISO8601:[@metadata][timestamp]}" ]
  }
  date {
    match => [ "[@metadata][timestamp]" , "yyyy-MM-dd HH:mm:ss" ]
    target => "@timestamp"
  }
}
output {
  elasticsearch {
    hosts => "10.26.222.213:9200"
    index => "logstash-%{[type]}-%{+YYYY}"
  }
  stdout {
    codec => rubydebug
  }
}
the result is as follows: the field values display fine and the timestamp is parsed correctly, but every field whose name is Chinese always comes out as mojibake:
{
    "path" => "/logs/test/json.log",
    "ç\u0094\u009Fæ\u0097¥" => "2016-01-02 23:12:13",
    "@timestamp" => 2016-01-02T15:12:13.000Z,
    "å¹´é¾\u0084" => 20,
    "sex" => "女",
    "@version" => "1",
    "host" => "qvit",
    "name" => "李四",
    "message" => "{\"name\":\"李四\",\"sex\":\"女\",\"年龄\":20,\"生日\":\"2016-01-02 23:12:13\"}",
    "type" => "json_test"
}
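My guess is that, without a codec on the file input, the line gets decoded with the platform default charset before the json filter runs, which would explain the garbled field names. Would forcing UTF-8 on the input while keeping the json filter be the right direction? A rough, untested sketch, assuming the log file itself is UTF-8:

input {
  file {
    type => "json_test"
    path => "/logs/test/json.log"
    # hypothetical fix: decode the raw line as UTF-8 before the json filter sees it
    codec => plain {
      charset => "UTF-8"
    }
    start_position => "beginning"
    sincedb_path => "/dev/null"
  }
}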