
Getting a 400 error with a custom template when using Logstash to sync MySQL data to ES

Logstash | Author: zw_es | Published 2018-05-02 | Views: 14534

As the title says: while using Logstash to sync MySQL data into ES, I needed one of the fields to be mapped as the keyword type, so I supplied a custom template with dynamic templates. But I hit the problem below and have not been able to resolve it:
[2018-05-02T15:41:08,959][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-05-02T15:41:08,975][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>"D://elasticsearch/logstash/importMysql/state.json"}
[2018-05-02T15:41:08,992][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"myindex"=>{"order"=>0, "version"=>61111, "index_patterns"=>["myindex*"], "settings"=>{"index"=>{"refresh_interval"=>"5s"}}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}, "aliases"=>{}}}}
[2018-05-02T15:41:09,026][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/myindex
[2018-05-02T15:41:09,079][ERROR][logstash.outputs.elasticsearch] Failed to install template. {:message=>"Got response code '400' contacting Elasticsearch at URL 'http://127.0.0.1:9200/_template/myindex'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:290:in `perform_request_to_url'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:277:in `block in perform_request'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:372:in `with_connection'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:276:in `perform_request'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:284:in `block in Pool'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:338:in `template_put'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/http_client.rb:82:in `template_install'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/template_manager.rb:21:in `install'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/template_manager.rb:9:in `install_template'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/common.rb:57:in `install_template'", "D:/elasticsearch/logstash/logstash-6.2.3/vendor/bundle/jruby/2.3.0/gems/logstash-output-elasticsearch-9.0.3-java/lib/logstash/outputs/elasticsearch/common.rb:26:in `register'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/output_delegator_strategies/shared.rb:9:in `register'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/output_delegator.rb:42:in `register'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:341:in `register_plugin'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:352:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:352:in `register_plugins'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:735:in `maybe_setup_out_plugins'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:362:in `start_workers'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:289:in 
`run'", "D:/elasticsearch/logstash/logstash-6.2.3/logstash-core/lib/logstash/pipeline.rb:249:in `block in start'"]}
[2018-05-02T15:41:09,089][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://127.0.0.1:9200"]}
[2018-05-02T15:41:09,453][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>"main", :thread=>"#<Thread:0x1fe15aeb sleep>"}
The stdin plugin is now waiting for input:
[2018-05-02T15:41:09,546][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
Could someone explain what is going wrong here? For reference, ES is version 6.2.2, JDK 1.8, Logstash 6.2.3.
The conf being run:
input {
  stdin {
  }
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/alarm"
    jdbc_user => "root"
    jdbc_password => "root"
    jdbc_driver_library => "D:\elasticsearch\logstash\importMysql\mysql-connector-java-5.1.38.jar"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_paging_enabled => "true"
    jdbc_page_size => "50000"
    statement_filepath => "D:\elasticsearch\logstash\importMysql\state_temp.sql"
    schedule => "* * * * *"
  }
}

filter {
  date {
    match => ["createTime", "yyyy-MM-dd HH:mm:ss"]
    target => "@timestamp"
  }
}

output {
  elasticsearch {
    hosts => "http://127.0.0.1:9200&quot;
    index => "myindex"
    document_id => "%{id}"
    document_type => "mytype"
    manage_template => true
    template_overwrite => true
    template_name => "myindex"
    template => "D://elasticsearch/logstash/importMysql/state.json"
  }
}
The dynamic template:
{
  "myindex": {
    "order": 0,
    "version": 61111,
    "index_patterns": ["myindex*"],
    "settings": {
      "index": {
        "refresh_interval": "5s"
      }
    },
    "mappings": {
      "_default_": {
        "dynamic_templates": [
          {
            "message_field": {
              "path_match": "message",
              "match_mapping_type": "string",
              "mapping": {
                "type": "text",
                "norms": false
              }
            }
          },
          {
            "string_fields": {
              "match": "alarmNo",
              "match_mapping_type": "string",
              "mapping": {
                "type": "text",
                "norms": false,
                "fields": {
                  "keyword": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
            }
          }
        ],
        "properties": {
          "@timestamp": {
            "type": "date"
          },
          "@version": {
            "type": "keyword"
          },
          "geoip": {
            "dynamic": true,
            "properties": {
              "ip": {
                "type": "ip"
              },
              "location": {
                "type": "geo_point"
              },
              "latitude": {
                "type": "half_float"
              },
              "longitude": {
                "type": "half_float"
              }
            }
          }
        }
      }
    },
    "aliases": {}
  }
}

ddeason

Upvotes from: txwdwyq

I hit the same `Failed to install template. {:message=>"Got response code '400'` error, wrestled with it for a couple of hours, and finally got it sorted out.
The flow inside Logstash is: it takes the JSON file you wrote and stores it in Elasticsearch under template_name => "myindex". Logstash then reads the template back to check that it is there; the 400 means that `GET _template/myindex` errored out and nothing could be read.
The reason it cannot be read is most likely a syntax error in the JSON file, which means the earlier `PUT _template/myindex` had already failed.
 
The fix: get a Kibana instance and run
PUT _template/test
{
   ..... // the contents of your JSON file
}
If the PUT fails, the JSON has a syntax problem; delete pieces bit by bit to narrow it down and fix it gradually, since ES's own error messages are quite vague.
If the PUT succeeds, your JSON is fine; paste it back into the .json file, try again, and it should work.
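
For reference, a minimal Dev Tools sketch of that check, using the template body posted in the question trimmed down to the relevant dynamic template (the name `test` is a throwaway). One thing to watch when pasting: the body of the PUT must not be wrapped in the template name itself. `GET _template/<name>` returns the template wrapped that way, so a `{"myindex": { ... }}` object copied back from a GET is a plausible source of the 400 in this thread.

# throwaway template, only to check that the body parses
PUT _template/test
{
  "order": 0,
  "version": 61111,
  "index_patterns": ["myindex*"],
  "settings": {
    "index": { "refresh_interval": "5s" }
  },
  "mappings": {
    "_default_": {
      "dynamic_templates": [
        {
          "string_fields": {
            "match": "alarmNo",
            "match_mapping_type": "string",
            "mapping": {
              "type": "text",
              "norms": false,
              "fields": {
                "keyword": { "type": "keyword", "ignore_above": 256 }
              }
            }
          }
        }
      ]
    }
  }
}

# read it back the same way Logstash does
GET _template/test

# clean up once the body is known to be good
DELETE _template/test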

medcl - Tonight we fight tigers.

Upvotes from:

     hosts => "http://127.0.0.1:9200&amp;quot;
这里的配置有问题,修改下。
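
A minimal sketch of what the corrected output section would look like, assuming the stray &quot; is nothing more than a mangled closing quote and every other setting stays exactly as posted in the question:

output {
  elasticsearch {
    # a real closing quote instead of the HTML entity; a bracketed list is also accepted for hosts
    hosts => ["http://127.0.0.1:9200"]
    index => "myindex"
    document_id => "%{id}"
    document_type => "mytype"
    manage_template => true
    template_overwrite => true
    template_name => "myindex"
    template => "D://elasticsearch/logstash/importMysql/state.json"
  }
}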
