Adding a new Elasticsearch index and mapping.


infos...@roadflares.org

May 25, 2018, 8:50:05 AM5/25/18
to security-onion
I'm attempting to add a new index to Elasticsearch to handle syslog data coming from a Palo Alto firewall, and I'm having trouble getting it working properly. I'm pretty sure that this line from the Security Onion Logstash support page is what I'm running afoul of:

"Currently, new fields that do not match the template are stored in Elasticsearch, however, they are not indexed, unless provided in a mapping template."

Running " curl -X GET "localhost:9200/paloalto-traffic/_mapping/_doc" " returns the expected mapping, so Elasticsearch appears to be aware of what it should be doing with the data, but Logstash isn't putting anything in there and Kibana doesn't recognize that the index exists. This mapping was manually added to Elasticsearch using the mapping portion of the attached file and the command " curl -X PUT "localhost:9200/paloalto-traffic" -H 'Content-Type: application/json' -d @mappings.txt "

I've attached the mapping template that I'm using to this post. This file is in /etc/logstash/custom, and is copied into /etc/logstash when Logstash restarts.
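
For anyone following along, the general shape of that file is something like the following -- this is only an illustrative sketch, not the actual attachment, and the field names and types are placeholders:

{
  "template": "paloalto-traffic*",
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "_doc": {
      "properties": {
        "src_ip": { "type": "ip" },
        "dst_ip": { "type": "ip" },
        "bytes_total": { "type": "long" },
        "action": { "type": "keyword" }
      }
    }
  }
}

The "mappings" portion is what went into the manual PUT above; the whole file is what the Logstash output references as the template.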

My securityonion.conf has been updated with this option:

LOGSTASH_OPTIONS="--volume /etc/logstash/custom/paloalto-traffic.json:/paloalto-traffic.json:ro"


I've added a custom parser in /etc/logstash/custom/6202_firewall_paloalto that breaks everything out of the syslog into the proper fields and tags it as "paloalto" and "paloalto-traffic".
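
In case it's useful to anyone else, a heavily trimmed-down version of that kind of parser looks roughly like this (the host address, column names, and PAN-OS field layout here are made-up placeholders, not my actual file):

filter {
  if [host] == "192.0.2.1" {
    csv {
      source => "message"
      columns => [ "future_use", "receive_time", "serial_number", "type", "subtype" ]
    }
    if [type] == "TRAFFIC" {
      mutate {
        add_tag => [ "paloalto", "paloalto-traffic" ]
      }
    }
  }
}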


I've added a custom output file in /etc/logstash/custom/9201_output_paloalto:

filter {
  if "paloalto" in [tags] and "test_data" not in [tags] {
    mutate {
      ##add_tag => [ "conf_file_9200" ]
    }
  }
}

output {
  if "paloalto-traffic" in [tags] and "test_data" not in [tags] {
    # stdout { codec => rubydebug }
    elasticsearch {
      hosts => elasticsearch
      index => "paloalto-traffic"
      template_name => "paloalto-traffic"
      template => "/paloalto-traffic.json"
      # template_overwrite => true
    }
  }
}

Previously, when the data was being tagged as "syslog" and "syslogng" in addition to "paloalto" and "paloalto-traffic", the logs were being properly parsed and displayed in Kibana in the logstash-* index. When I added a directive to remove those tags after adding the paloalto ones, the results stopped showing up in Kibana.
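
For reference, the tag-removal directive was just a mutate along these lines (paraphrased, not copied verbatim from my config):

mutate {
  remove_tag => [ "syslog", "syslogng" ]
}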

So, I'm not sure if the issue is with my mapping file or if I've misconfigured something in my LOGSTASH_OPTIONS or my output file. Any help would be appreciated. I've been using the old ELSA-based Security Onion for a long time, but Elastic Stack is new to me.
paloalto-traffic.json

Wes Lambert

May 25, 2018, 3:23:28 PM5/25/18
to securit...@googlegroups.com
To clarify, the template you are referring to above is not exactly the template to which we are referring :)

Currently, we statically map fields with a mapping template to prevent field explosion. This mapping template is called upon during index creation and applied at that time (you can see the respective output config files in /etc/logstash/conf.d, where it says "template => <template_name>"). The location of the template in the output file is relative to the inside of the container, where the files are mounted, so you should only need to refer to it as /template-name once it is mounted through LOGSTASH_OPTIONS.
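
In other words, once the file is mounted into the container through LOGSTASH_OPTIONS, the output block refers to it by its in-container path, along these lines (illustrative, using your index name):

elasticsearch {
  hosts => elasticsearch
  index => "paloalto-traffic"
  template_name => "paloalto-traffic"
  template => "/paloalto-traffic.json"
}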

The default mapping template for logstash-* is currently /etc/logstash/logstash-template.json, and for Beats, /etc/logstash/beats-template.json. If you need to modify either of these, copy the file to /etc/logstash/custom and make your changes there.
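
For example, to base a custom template on the stock one (the destination filename is just an example):

sudo cp /etc/logstash/logstash-template.json /etc/logstash/custom/paloalto-traffic.json
# then edit /etc/logstash/custom/paloalto-traffic.json to suit your fields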

I believe what you were referring to above was the "mapping" or "config" file.

If you do the following:

curl localhost:9200/_cat/indices

...do you see the index that you are meaning to create?
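
If the index was created, it should show up as a line in that output, something like the following (the values here are made up):

green open paloalto-traffic xY3k9ZpRQmWn1aBcDeFgHw 5 1 12345 0 4.2mb 4.2mb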

You'll want to place your custom config file (.conf), as well as your custom mapping template (.json), specific to your index, in /etc/logstash/custom, then restart Logstash. Additionally, we currently have to add the mapping template (.json) to the Logstash options in /etc/nsm/securityonion.conf so it is accessible inside the container. In the future, we hope to remove this step.
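
Concretely, with the filenames you mentioned, that would look something like this (illustrative layout):

/etc/logstash/custom/6202_firewall_paloalto.conf   <- parser/filter
/etc/logstash/custom/9201_output_paloalto.conf     <- output
/etc/logstash/custom/paloalto-traffic.json         <- mapping template

and in /etc/nsm/securityonion.conf:

LOGSTASH_OPTIONS="--volume /etc/logstash/custom/paloalto-traffic.json:/paloalto-traffic.json:ro"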

You may be getting errors on the template name, etc., and Logstash may not be creating the index at all. Once you do get the index created, you will need to configure an index pattern in Kibana to be able to view the data matching the specified index pattern.
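
Depending on your Kibana version, you may also be able to create the index pattern from the command line via the saved objects API instead of the Kibana UI (Management > Index Patterns); this is only a sketch and may need adjusting for your version:

curl -XPOST "localhost:5601/api/saved_objects/index-pattern/paloalto-traffic" \
  -H 'kbn-xsrf: true' -H 'Content-Type: application/json' \
  -d '{"attributes":{"title":"paloalto-traffic*","timeFieldName":"@timestamp"}}'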

Thanks,
Wes



infos...@roadflares.org

May 31, 2018, 10:11:52 AM5/31/18
to security-onion
Thanks, Wes. I appreciate the swift reply.

It turns out that I had made a silly mistake -- my output configuration in /etc/logstash/custom didn't have .conf at the end of the file name, so it wasn't being pulled in properly. Once I updated that, and changed the mapping type from "_doc" to "doc", the data started importing into the proper index.
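
For anyone who hits the same thing later, the mapping change was just the type name inside the mappings block, roughly:

before:  "mappings": { "_doc": { "properties": { ... } } }
after:   "mappings": { "doc": { "properties": { ... } } }

which I gather lines up with the document type Logstash was writing events with in this setup.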

Thanks again for the help.

Cyrus Field

Jan 28, 2020, 3:54:17 AM1/28/20
to security-onion
Hey Wes, 

I am fairly new to the Onion and to security engineering, and I have been struggling to add a parser and mapping templates for Palo Alto. I was able to successfully add a grok filter .conf that parses the syslogs I am getting from my Palo Alto, but I am looking to do something like this: https://github.com/shadow-box/Palo-Alto-Networks-ELK-Stack. I have found some resources, but most are incomplete when it comes to the Onion. I have found examples of what I am looking to do that are specific to Elastic, but when I try to add them to /etc/logstash/custom, Logstash breaks; from what I can tell from logstash.log, it appears to be looking for a standard Elastic instance rather than one running in Docker. Is there a better example somewhere that I am missing for getting the parser and templates I need for PAN-OS 9.x? I have gone through the Onion Docs dated 20 Jan 2020 and read the docs. Most of the links in readthedocs seem to point to standard Elastic builds again, and are not Onion-specific when it comes to modules, parsers, and templates.

Thanks, 

Cyrus 



Pete Halatsis

Sep 16, 2020, 8:21:06 PM9/16/20
to security-onion
Cyrus,

I am in the same boat! I am having trouble merging multiple docs together to get this all to work. I just want my Palo Alto syslog data to come in parsed correctly!
