GET /_analyze
{
"analyzer" : "ik_max_word",
"text" : "AQ0556-200"
}
{
"tokens": [
{
"token": "aq0556-200",
"start_offset": 0,
"end_offset": 10,
"type": "LETTER",
"position": 0
},
{
"token": "aq",
"start_offset": 0,
"end_offset": 2,
"type": "ENGLISH",
"position": 1
},
{
"token": "0556",
"start_offset": 2,
"end_offset": 6,
"type": "ARABIC",
"position": 2
},
{
"token": "200",
"start_offset": 7,
"end_offset": 10,
"type": "ARABIC",
"position": 3
}
]
}
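For reference, the mismatch described here is what a mapping along these lines would produce. This is only a sketch: the index name products, the field name sku, and the ik_max_word / ik_smart split are assumptions for illustration, since the actual mapping is not shown in the post.

PUT /products
{
  "mappings": {
    "properties": {
      "sku": {
        "type": "text",
        "analyzer": "ik_max_word",
        "search_analyzer": "ik_smart"
      }
    }
  }
}

With a mapping like this, the four tokens above (aq0556-200, aq, 0556, 200) are exactly the terms written to the inverted index for "AQ0556-200".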
The keyword the user actually searches with omits the '-': "AQ0556 200"
GET /_analyze
{
"analyzer" : "ik_smart",
"text" : "AQ0556 200"
}
"tokens": [
{
"token": "aq0556",
"start_offset": 0,
"end_offset": 6,
"type": "LETTER",
"position": 0
},
{
"token": "200",
"start_offset": 7,
"end_offset": 10,
"type": "ARABIC",
"position": 1
}
]
}
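One way to confirm what really gets indexed is to run _analyze against the concrete index with the field parameter, so the field's own index-time analyzer is applied instead of one named by hand (products and sku are the same hypothetical names as in the mapping sketch above):

GET /products/_analyze
{
  "field": "sku",
  "text": "AQ0556-200"
}

If the field is indeed analyzed with ik_max_word, this returns the same four tokens as the first request, and aq0556 on its own is not one of them.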
The result here is clearly broken: the search returns no data at all. And even though I have already added "-" as a stop word, it has no effect at index time either. Shouldn't index-time analysis also produce an "aq0556" token?
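To make the failure concrete: the query text is analyzed into aq0556 and 200, but aq0556 was never produced at index time, so a match query with the and operator (which fits the reported "no data at all") finds nothing. A sketch, again with the hypothetical products / sku names:

GET /products/_search
{
  "query": {
    "match": {
      "sku": {
        "query": "AQ0556 200",
        "operator": "and"
      }
    }
  }
}

As for getting an aq0556 token at index time: a stopword list only filters whole tokens after tokenization, so adding "-" to it does not change how "aq0556-200" is split. One commonly used workaround (a sketch, not taken from this thread; all names are made up) is to turn the '-' into a space with a char_filter before the IK tokenizer runs, and to use that analyzer for the field, so index-time and search-time tokens line up:

PUT /products_v2
{
  "settings": {
    "analysis": {
      "char_filter": {
        "dash_to_space": {
          "type": "pattern_replace",
          "pattern": "-",
          "replacement": " "
        }
      },
      "analyzer": {
        "ik_max_word_dash": {
          "type": "custom",
          "char_filter": ["dash_to_space"],
          "tokenizer": "ik_max_word"
        }
      }
    }
  },
  "mappings": {
    "properties": {
      "sku": {
        "type": "text",
        "analyzer": "ik_max_word_dash"
      }
    }
  }
}

With this setup "AQ0556-200" is analyzed the same way as "AQ0556 200", so aq0556 should end up in the index and the user's query can match.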
1 reply
God_lockin