How to sort alphanumeric and numeric values in a field in elasticsearch [duplicate]

I am trying to create a mapping for a field that will only be used for exact matches and sorting. I don't want to set the primary data type to text, since I only need exact matches.
{
"index_patterns": "*",
"mappings": {
"doc": {
"_source": {
"enabled": true
},
"properties": {
"my_field": {
"type": "keyword",
"index": true,
"fielddata": true
}
}
}
}
}
But I am getting the following error:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "Mapping definition for [my_field] has unsupported parameters: [fielddata : true]"
}
],
"type": "mapper_parsing_exception",
"reason": "Failed to parse mapping [doc]: Mapping definition for [my_field] has unsupported parameters: [fielddata : true]",
"caused_by": {
"type": "mapper_parsing_exception",
"reason": "Mapping definition for [my_field] has unsupported parameters: [fielddata : true]"
}
},
"status": 400
}
The problem is that without fielddata, sorting is not happening properly. For example, the following is a sample asc sort output:
"90000001"
"90000001"
""
""
""
"90000008"
"9100000"

You don't need to set fielddata: true for keyword fields. You can also remove index: true, as it is the default setting for keyword fields.
Simply like this:
{
"index_patterns": "*",
"mappings": {
"doc": {
"_source": {
"enabled": true
},
"properties": {
"my_field": {
"type": "keyword"
}
}
}
}
}
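With this mapping, sorting uses doc values, so no fielddata is needed. A minimal sketch of a sort query, assuming an index named my_index (the index name is illustrative):
GET my_index/_search
{
  "sort": [
    { "my_field": { "order": "asc" } }
  ]
}
Keep in mind that keyword sorting is lexicographic (character by character), so numeric strings of different lengths, such as "9100000" and "90000008", will not sort in numeric order.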

Related

Need to sort a field with numeric and alpha numeric values in elasticsearch


ElasticSearch action_request_validation_exception

I'm creating a mapping for multiple types.
Here is my query:
PUT opl_consultation/_mapping
my json mapping file
{
"mappings": {
"article": {
"properties": {
"numero_noeud": { "type": "text" },
"intitule_fr": { "type": "text" },
"path_audio": { "type": "text" }
}
},
"hierarchie": {
"properties": {
"id_type_noeud_hie": { "type": "integer" },
"noeud_numero_hie": { "type": "text" },
"intitule_hie_fr": { "type": "text" }
}
},
"law_type": {
"properties": {
"id_type_loi": { "type": "integer" },
"Desc_law_type": { "type": "text" }
}
}
}
}
Below is the error I got:
"type": "action_request_validation_exception",
"reason": "Validation Failed: 1: mapping type is missing;"
},
"status": 400
}
The version is Elasticsearch 6.4.2.
In Elasticsearch 6.4.2 you cannot have more than one mapping type. See https://www.elastic.co/guide/en/elasticsearch/reference/current/removal-of-types.html
If you instead run your query as PUT opl_consultation with your mapping definition, you will get the error below:
"type": "illegal_argument_exception",
"reason": "Rejecting mapping update to [opl_consultation] as the final mapping would have more than 1 type: [law_type, article, hierarchie]"
Instead, use a custom type field as described here.
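A minimal sketch of that approach, assuming a single mapping type named doc and a hypothetical doc_type field used to distinguish the document kinds (both names are illustrative):
PUT opl_consultation
{
  "mappings": {
    "doc": {
      "properties": {
        "doc_type": { "type": "keyword" },
        "numero_noeud": { "type": "text" },
        "intitule_fr": { "type": "text" },
        "path_audio": { "type": "text" },
        "id_type_noeud_hie": { "type": "integer" },
        "noeud_numero_hie": { "type": "text" },
        "intitule_hie_fr": { "type": "text" },
        "id_type_loi": { "type": "integer" },
        "Desc_law_type": { "type": "text" }
      }
    }
  }
}
Each document then carries doc_type: "article", "hierarchie", or "law_type", and queries filter on that field instead of the mapping type.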

In Elasticsearch, can I set language-specific multi-fields?

Is it possible to use multi-fields to set and query multilingual fields?
Consider this mapping:
PUT multi_test
{
"mappings": {
"data": {
"_field_names": {
"enabled": false
},
"properties": {
"book_title": {
"type": "text",
"fields": {
"english": {
"type": "text",
"analyzer": "english"
},
"german": {
"type": "text",
"analyzer": "german"
},
"italian": {
"type": "text",
"analyzer": "italian"
}
}
}
}
}
}
}
I tried the following, but it doesn't work:
PUT multi_test/data/1
{
"book_title.english": "It's good",
"book_title.german": "Das gut"
}
The error seems to indicate I'm trying to add new fields:
{ "error": { "root_cause": [ { "type": "mapper_parsing_exception",
"reason": "Could not dynamically add mapping for field
[book_title.english]. Existing mapping for [book_title] must be of
type object but found [text]." } ], "type":
"mapper_parsing_exception", "reason": "Could not dynamically add
mapping for field [book_title.english]. Existing mapping for
[book_title] must be of type object but found [text]." }, "status":
400 }
What am I doing wrong here?
If my approach is unworkable, what is a better way to do this?
The problem is that you are using fields for the field book_title.
The fields keyword is used when you want to index the same field and data in multiple ways, i.e. with different analyzers or other setting changes, but the values must be the same for all field names under fields. Here is the link describing multi-fields: https://www.elastic.co/guide/en/elasticsearch/reference/2.4/multi-fields.html
In your use case the mapping should be like below:
PUT multi_test
{
"mappings": {
"data": {
"_field_names": {
"enabled": false
},
"properties": {
"book_title": {
"properties": {
"english": {
"type": "text",
"analyzer": "english"
},
"german": {
"type": "text",
"analyzer": "german"
},
"italian": {
"type": "text",
"analyzer": "italian"
}
}
}
}
}
}
}
This defines book_title as an object type, so you can add multiple fields with different data under book_title, as in the example below.
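For example, a minimal sketch of indexing a document against this mapping (the document id 1 is just for illustration):
PUT multi_test/data/1
{
  "book_title": {
    "english": "It's good",
    "german": "Das gut"
  }
}
Queries can then target book_title.english and book_title.german independently.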

failed to parse timestamp elasticsearch

I'm new to Elasticsearch and am trying to create my first index, but I'm having issues with a timestamp field that was working before...
I created my index like this:
PUT /kafkasdp
{
"mappings": {
"kafka_logs": {
"properties": {
"timestamp": {
"type": "date"
},
"log_level": {
"type": "string"
},
"message1": {
"type": "string"
},
"message2": {
"type": "string"
}
}
}
}
}
and then I'm trying to send data like this:
POST /kafkasdp/kafka_logs
{
"timestamp": "2017-02-03 19:27:20,606",
"log_level": "INFO",
"message2": "Deleting segment 1 from log omega-replica-sync-dev-8. (kafka.log.Log)"
}
but keep getting this error:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "failed to parse [timestamp]"
}
],
"type": "mapper_parsing_exception",
"reason": "failed to parse [timestamp]",
"caused_by": {
"type": "illegal_argument_exception",
"reason": "Invalid format: \"2017-02-03 19:27:20,606\" is malformed at \" 19:27:20,606\""
}
},
"status": 400
}
I thought my timestamp was a valid date?
Read about the date type in the Elasticsearch reference: you should specify the format of the dates you expect in your documents:
PUT your_index_name
{
"mappings": {
"your_index_type": {
"properties": {
"date": {
"type": "date",
"format": "yyyy-MM-dd HH:mm:ss,SSS"
}
}
}
}
}
As you did not specify it, Elasticsearch expects date values in the default ISO 8601 format:
yyyy-MM-dd'T'HH:mm:ss.SSS'Z' (e.g., 2017-02-03T19:27:20.606Z)
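With that format in the mapping, the document from the question should index as-is. A minimal sketch using the index and type names from the question (string fields omitted for brevity; if kafkasdp already exists with the old mapping, it likely needs to be deleted and recreated, since existing field mappings generally cannot be changed):
PUT /kafkasdp
{
  "mappings": {
    "kafka_logs": {
      "properties": {
        "timestamp": {
          "type": "date",
          "format": "yyyy-MM-dd HH:mm:ss,SSS"
        }
      }
    }
  }
}

POST /kafkasdp/kafka_logs
{
  "timestamp": "2017-02-03 19:27:20,606",
  "log_level": "INFO",
  "message2": "Deleting segment 1 from log omega-replica-sync-dev-8. (kafka.log.Log)"
}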

Update and search in multi field properties in ElasticSearch

I'm trying to use multi-field properties for multi-language support. I created the following mapping for this:
{
"mappings": {
"product": {
"properties": {
"prod-id": {
"type": "string"
},
"prod-name": {
"type": "string",
"fields": {
"en": {
"type": "string",
"analyzer": "english"
},
"fr": {
"type": "string",
"analyzer": "french"
}
}
}
}
}
}
}
I created test record:
{
"prod-id": "1234567",
"prod-name": [
"Test product",
"Produit d'essai"
]
}
and tried to query using one specific language:
{
"query": {
"bool": {
"must": [
{"match": {
"prod-name.en": "Produit"
}}
]
}
}
}
As a result I got my document back. But I expected an empty result when I search for French text in the English field. It seems Elasticsearch ignores which field I specified in the query: there is no difference in the search results whether I use "prod-name.en", "prod-name.fr", or just "prod-name". Is this behaviour expected? Do I need to do something special to search in just one language?
Another problem is with updating a multi-field property. I can't update just one field:
{
"doc" : {
"prod-name.en": "Test"
}
}
I got following error:
{
"error": {
"root_cause": [
{
"type": "mapper_parsing_exception",
"reason": "Field name [prod-name.en] cannot contain '.'"
}
],
"type": "mapper_parsing_exception",
"reason": "Field name [prod-name.en] cannot contain '.'"
},
"status": 400
}
Is there any way to update just one field in a multi-field property?
In your mapping, the prod-name.en field will simply be analyzed using the english analyzer, and the same goes for the french field; however, ES will not choose for you which value to put in which field.
Instead, you need to modify your mapping like this:
{
"mappings": {
"product": {
"properties": {
"prod-id": {
"type": "string"
},
"prod-name": {
"type": "object",
"properties": {
"en": {
"type": "string",
"analyzer": "english"
},
"fr": {
"type": "string",
"analyzer": "french"
}
}
}
}
}
}
}
and your input document should look like this; then you'll get the results you expect:
{
"prod-id": "1234567",
"prod-name": {
"en": "Test product",
"fr": "Produit d'essai"
}
}
As for the updating part, your partial document should look like this instead:
{
"doc" : {
"prod-name": {
"en": "Test"
}
}
}
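For completeness, a minimal sketch of the full partial-update request, assuming the product was indexed with id 1 in an index named products (the index, type, and id are illustrative):
POST products/product/1/_update
{
  "doc": {
    "prod-name": {
      "en": "Test"
    }
  }
}
The _update API merges object fields recursively, so this changes only prod-name.en and leaves prod-name.fr untouched.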
