
How can Kibana query, analyze, and filter on nested JSON fields?

Kibana | Author: 迷途的攻城狮 | Posted on 2017-06-21 | Views: 11719

The JSON document as shown in Kibana:
{
  "_index": "filebeat-application_bizz-2017.06.21",
  "_type": "application_bizz",
  "_id": "AVzJIPePOT9kBmzuHiSA",
  "_version": 1,
  "_score": null,
  "_source": {
    "offset": 5847157,
    "level": "BIZZ",
    "input_type": "log",
    "source": "/Users/lion/IdeaProjects/log4j/target/logs/bizz.log",
    "thread": "main",
    "message": "{\"content\":{\"date\":1498022993014,\"sex\":0,\"name\":\"test properties\",\"message\":\"30b94213-77b3-4d1e-bf62-cd1642ea934f\\n28d10c62-24fe-4b8e-a7a5-726a07d8a073\"},\"level\":\"BIZZ\",\"point\":\"com.dameng.test.Test:33\",\"systemName\":\"TestSystem\",\"thread\":\"main\",\"timestamp\":\"2017-06-21 13:29:53,014\"}",
    "type": "application_bizz",
    "content": {
      "date": 1498022993014,
      "sex": 0,
      "name": "test properties",
      "message": "30b94213-77b3-4d1e-bf62-cd1642ea934f\n28d10c62-24fe-4b8e-a7a5-726a07d8a073"
    },
    "point": "com.dameng.test.Test:33",
    "tags": [
      "beats_input_codec_plain_applied"
    ],
    "@timestamp": "2017-06-21T05:29:53.014Z",
    "systemName": "TestSystem",
    "host": "LiondeMacBook-Pro.local",
    "timestamp": "2017-06-21 13:29:53,014"
  },
  "fields": {
    "@timestamp": [
      1498022993014
    ]
  },
  "sort": [
    1498022993014
  ]
}
Discover screenshot: WX20170621-133311@2x.png
Partial mapping:
          "beat": {
"properties": {
"hostname": {
"type": "keyword",
"ignore_above": 1024
},
"name": {
"type": "keyword",
"ignore_above": 1024
},
"version": {
"type": "keyword",
"ignore_above": 1024
}
}
},
"content": {
"properties": {
"date": {
"type": "long"
},
"message": {
"type": "keyword",
"ignore_above": 1024
},
"name": {
"type": "keyword",
"ignore_above": 1024
},
"sex": {
"type": "long"
}
}
},
Question: I want the fields inside content ("date", "sex", "name", "message") to behave like the top-level fields (level, point, thread): visible in the field list on the left in Discover, and individually filterable, sortable, and so on, instead of the whole content field being treated as a single string. Can this be done?
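For reference, once the content.* sub-fields are indexed as separate fields (as in the mapping fragment above), they can be targeted directly. A minimal sketch of a per-field query in Elasticsearch's query DSL, using the index pattern from the document above (the search values are only examples):

GET filebeat-application_bizz-*/_search
{
  "query": {
    "bool": {
      "filter": [
        { "term": { "content.name": "test properties" } },
        { "range": { "content.date": { "gte": 1498000000000 } } }
      ]
    }
  }
}

The equivalent Lucene-syntax filter in the Discover search bar would be content.name:"test properties".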

WangYahua


Hello, did you ever solve this problem? I'd like to know how, many thanks.

typuc - post-80s IT guy, table tennis enthusiast


This should be handled in Logstash, which can do a second round of JSON parsing. In my setup, message itself is a JSON string, and the log_message field inside it is JSON as well.
The configuration I'm currently using:
 
input {
  kafka {
    bootstrap_servers => "xxxxx:9092"
    group_id          => "test"
    topics            => ["microservice_json"]
    decorate_events   => true
    consumer_threads  => 4
    codec             => json
  }
}
filter {
  # first pass: parse the JSON string in "message" into separate event fields
  json {
    source => "message"
  }
  mutate {
    remove_field => ["message"]       # drop the raw string once parsed
  }
}
filter {
  # second pass: "log_message" (produced by the first pass) is itself JSON
  json {
    source => "log_message"
  }
  mutate {
    remove_field => ["log_message"]
  }
}
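Applied to the original question, where the embedded JSON sits in the message field shipped by Filebeat, a minimal sketch would be a single json filter (omitting the filter's optional target setting so the parsed keys, including the content object, land at the event root as separate, filterable fields):

filter {
  json {
    source => "message"             # the escaped JSON string from the log line
  }
  mutate {
    remove_field => ["message"]     # optional: drop the raw string after parsing
  }
}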

 
 
