Working with GEO_POINT


Ivan Martinez

Jul 7, 2015, 6:49:30 AM
to elastics...@googlegroups.com
Hi,

I upgraded ELK to the latest versions and a few things broke. I've managed to fix almost everything except the geo part.

In the previous version, to load the coordinates so that Kibana could work with them, it was enough to build a hash of the coordinates. But the new version requires a geo_point, and I can't manage to transform it.

Has this happened to anyone else? Do you know how to fix it?


Regards

Pere Urbon-Bayes

Jul 7, 2015, 7:30:56 AM
to Ivan Martinez, elastics...@googlegroups.com
Hi,
  I think you're looking for this: https://www.elastic.co/guide/en/elasticsearch/reference/current//mapping-geo-point-type.html#_input_structure , which lists the different input formats for geo_point.
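For quick reference, a geo_point field accepts several input shapes. A sketch with an illustrative field name (each line is an alternative document shape; a geohash string is also accepted):

```json
{ "location": { "lat": 40.44, "lon": -3.63 } }
{ "location": "40.44,-3.63" }
{ "location": [ -3.63, 40.44 ] }
```

Note that the array form is [lon, lat] (GeoJSON order), the reverse of the string form.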

By the way, a few weeks ago we created the forum https://discuss.elastic.co/c/in-your-native-tongue/elastic-en-espanol , in case you'd also like to ask questions there :-) (disclaimer: I work for Elastic)

Regards

- purbon


--
You received this message because you are subscribed to the "Elasticsearch ES" group on Google Groups.
To unsubscribe from this group and stop receiving its messages, send an email to elasticsearch-...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Ivan Martinez

Jul 7, 2015, 8:12:39 AM
to elastics...@googlegroups.com, ivmart...@gmail.com
Hi,

thanks for the info.
What I can't manage is integrating it into Logstash.

       mutate {
               add_field => [ "[pin][location]", "%{longitude}" ]
               add_field => [ "[pin][location]", "%{latitude}" ]
               convert => [ "[pin][location][longitude]", "float" ]
               convert => [ "[pin][location][latitude]", "float" ]
       }

How can it be mapped? Do I have to do it separately?

Pere Urbon-Bayes

Jul 7, 2015, 8:27:43 AM
to Ivan Martinez, elastics...@googlegroups.com
The Logstash output for Elasticsearch does this for you by default if your field is called "geoip"; more info at: https://github.com/logstash-plugins/logstash-output-elasticsearch/blob/master/lib/logstash/outputs/elasticsearch/elasticsearch-template.json#L31

If not, the best approach is to supply a custom mapping, in the style of the default template.
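A minimal custom index template mapping a nested field as geo_point might look like this (a sketch; the index pattern and field names are placeholders):

```json
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "mygeo": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}
```

Such a template can be pointed to with the `template` option of the elasticsearch output, or registered directly via the index template API.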

- purbon

Ivan Martinez

Jul 9, 2015, 6:55:51 AM
to elastics...@googlegroups.com, ivmart...@gmail.com
Hi,

I still can't get it to work. :-( Neither by modifying the geoip object nor by creating my own object.

These are the steps I followed:

1. Edit the mappings file to create the new type
~/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch/elasticsearch-template.json

{
  "template" : "logstash-*",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
       "_all" : {"enabled" : true, "omit_norms" : true},
       "dynamic_templates" : [ {
         "message_field" : {
           "match" : "message",
           "match_mapping_type" : "string",
           "mapping" : {
             "type" : "string", "index" : "analyzed", "omit_norms" : true
           }
         }
       }, {
         "string_fields" : {
           "match" : "*",
           "match_mapping_type" : "string",
           "mapping" : {
             "type" : "string", "index" : "analyzed", "omit_norms" : true,
               "fields" : {
                 "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
               }
           }
         }
       } ],
       "properties" : {
         "@version": { "type": "string", "index": "not_analyzed" },
         "geoip"  : {
           "type" : "object",
             "dynamic": true,
             "properties" : {
               "location" : { "type" : "geo_point" }
             }
         },
        "gfigeo" : {
           "type" : "object",
             "dynamic": true,
             "properties" : {
               "location" : { "type" : "geo_point" }
             }
        }
       }
    }
  }
}

2. Generate the Logstash configuration file
input{
  stdin{
  }
}

filter {

# Build the message
        grok {
                match => { "message" => "%{NUMBER:rownum};%{NUMBER:rownumsession};%{DATA:type};%{DATA:name};%{DATESTAMP:fecha};%{NUMBER:temperature};%{NUMBER:longitude};%{NUMBER:latitude};%{NUMBER:altitude}$" }
        }

# Build the datestamp
        date{
                match => [ "fecha" , "MM/dd/yyyy HH:mm:ss" ]
                timezone => "Etc/GMT"
        }

# Convert datatypes
        mutate {
                convert => [ "temperature", "float" ]
                convert => [ "longitude", "float" ]
                convert => [ "latitude", "float" ]
                convert => [ "altitude", "float" ]
        }

# Build the location
        mutate {
                add_field => [ "gfigeo[location]lon", "%{longitude}" ]
                add_field => [ "gfigeo[location]lat", "%{latitude}" ]
                convert => [ "gfigeo[location]lon", "float" ]
                convert => [ "gfigeo[location]lat", "float" ]
        }

}

output{
  file{
    path => "%{type}.%{+yyyy.MM.dd}.log"
  }

  elasticsearch { host => localhost }
}

3. Run Logstash

Log generation is correct => output.file

{"message":"0000002420;00000;gfirunner;Corredor3;06/25/2015 10:50:02;32.95;-4.40;40.69;0.00","@version":"1","@timestamp":"2015-06-25T10:50:02.000Z","host":"olivar","rownum":"0000002420","rownumsession":"00000","type":"gfirunner","name":"Corredor3","fecha":"06/25/2015 10:50:02","temperature":32.95,"longitude":-4.4,"latitude":40.69,"altitude":0.0,"gfigeo":{"location":{"lon":"-4.4","lat":"40.69"}}}
{"message":"0000002421;00001;gfirunner;Corredor3;06/25/2015 10:50:16;32.95;-4.40;40.69;0.00","@version":"1","@timestamp":"2015-06-25T10:50:16.000Z","host":"olivar","rownum":"0000002421","rownumsession":"00001","type":"gfirunner","name":"Corredor3","fecha":"06/25/2015 10:50:16","temperature":32.95,"longitude":-4.4,"latitude":40.69,"altitude":0.0,"gfigeo":{"location":{"lon":"-4.4","lat":"40.69"}}}



but the Kibana indices don't load it correctly:

gfigeo.location.lat     string 
gfigeo.location.lon     string
gfigeo.location.lon.raw     string
gfigeo.location.lat.raw     string


but the returned JSON looks fine to me:
{
  "_index": "logstash-2015.07.08",
  "_type": "gfirunner",
  "_id": "AU5yZkvrPdtvMjkTkMTT",
  "_score": null,
  "_source": {
    "message": "0000000438;00203;gfirunner;Corredor5;07/08/2015 11:06:00;29.95;-3.63;40.44;0.00",
    "@version": "1",
    "@timestamp": "2015-07-08T11:06:00.000Z",
    "host": "olivar",
    "rownum": "0000000438",
    "rownumsession": "00203",
    "type": "gfirunner",
    "name": "Corredor5",
    "fecha": "07/08/2015 11:06:00",
    "temperature": 29.95,
    "longitude": -3.63,
    "latitude": 40.44,
    "altitude": 0,
    "gfigeo": {
      "location": {
        "lon": "-3.63",
        "lat": "40.44"
      }
    }
  },
  "fields": {
    "@timestamp": [
      1436353560000
    ]
  },
  "sort": [
    1436353560000
  ]
}

As I said, I'd appreciate a hand here; I'm somewhat stuck.

Pere Urbon-Bayes

Jul 13, 2015, 5:51:38 AM
to Ivan Martinez, elastics...@googlegroups.com
Hi,
   the geo_point field must have a format like this:

"location" => [
    [0] 37.6156,
    [1] 55.75219999999999
]

Pere Urbon-Bayes

Jul 13, 2015, 5:54:12 AM
to Ivan Martinez, elastics...@googlegroups.com
I hit send too quickly; here is what I meant to say:

Hi,
   the geo_point field must have a format like this:

"location" => [
    [0] 37.6156,
    [1] 55.75219999999999
]

The one your configuration generates is not valid. To fix it, try replacing your mutate filters with something like this:


  ruby {
    code => '
      event["geoip"] = []
      event["geoip"] << event["longitude"]
      event["geoip"] << event["latitude"]
    '
  }

This will build, from the longitude and latitude, a field named geoip of geo_point type. You already know how to do the rest so that ES interprets it as such.
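Outside of Logstash, the effect of that ruby filter can be sketched with a plain Hash standing in for the event (an assumption; the real LogStash::Event has its own accessor API):

```ruby
# Simulate an event that already carries numeric coordinates.
event = {
  "longitude" => -3.63,
  "latitude"  => 40.44,
}

# Build the geo_point-compatible array: [lon, lat] (GeoJSON order).
event["geoip"] = []
event["geoip"] << event["longitude"]
event["geoip"] << event["latitude"]

puts event["geoip"].inspect  # => [-3.63, 40.44]
```

Serialized to JSON, this yields "geoip":[-3.63,40.44], which matches one of the accepted geo_point input formats as long as the field is mapped as geo_point.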

Regards,

- purbon


Ivan Martinez

Jul 15, 2015, 4:18:28 AM
to elastics...@googlegroups.com, ivmart...@gmail.com
Hi,

Many thanks for the help.
I added what you suggested: building the geo_point field with the ruby filter.

In the file output it now looks right:

{"message":"0000000235;00000;gfirunner;Corredor5;07/08/2015 10:16:48;25.62;-3.63;40.44;695.00","@version":"1","@timestamp":"2015-07-08T10:16:48.000Z","host":"olivar","rownum":"0000000235","rownumsession":"00000","type":"gfirunner","name":"Corredor5","fecha":"07/08/2015 10:16:48","temperature":25.62,"longitude":-3.63,"latitude":40.44,"altitude":695.0,"geoip":[-3.63,40.44]}

However, it still doesn't load into Elasticsearch.
It gives this error: failed action with response of 400, dropping action

I've been searching online and this seems to be a very common error. I probably have Elasticsearch misconfigured.

{:timestamp=>"2015-07-15T09:54:07.181000+0200", :message=>"failed action with response of 400, dropping action: [\"index\", {:_id=>nil, :_index=>\"logstash-2015.05.12\", :_type=>\"gfirunner\", :_routing=>nil}, #<LogStash::Event:0x1d06da95 @metadata={\"retry_count\"=>0}, @accessors=#<LogStash::Util::Accessors:0x6d58f86a @store={\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, @lut={\"host\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"host\"], \"message\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"message\"], \"rownum\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", 
\"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"rownum\"], \"rownumsession\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"rownumsession\"], \"type\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"type\"], \"name\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"name\"], \"fecha\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, 
\"fecha\"], \"temperature\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"temperature\"], \"longitude\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"longitude\"], \"latitude\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"latitude\"], \"altitude\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"altitude\"], \"@timestamp\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", 
\"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"@timestamp\"], \"geoip\"=>[{\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, \"geoip\"]}>, @data={\"message\"=>\"0000000001;00000;gfirunner;Corredor1;05/12/2015 14:16:22;32.15;0.00;0.00;0.00\", \"@version\"=>\"1\", \"@timestamp\"=>\"2015-05-12T14:16:22.000Z\", \"host\"=>\"olivar\", \"rownum\"=>\"0000000001\", \"rownumsession\"=>\"00000\", \"type\"=>\"gfirunner\", \"name\"=>\"Corredor1\", \"fecha\"=>\"05/12/2015 14:16:22\", \"temperature\"=>32.15, \"longitude\"=>0.0, \"latitude\"=>0.0, \"altitude\"=>0.0, \"geoip\"=>[0.0, 0.0]}, @metadata_accessors=#<LogStash::Util::Accessors:0x632ae7c8 @store={\"retry_count\"=>0}, @lut={}>, @cancelled=false>]", :level=>:warn}