Sonicwall logs


ericc...@gmail.com

Feb 1, 2019, 10:02:01 AM
to security-onion
Hi,

I'm not sure if this is necessarily an SO question or maybe better answered by Elastic, but here goes:

I've got SO running and, for the most part, everything seems okay. So far, alerts and things of that nature are working the way I expect. The problem I'm having is with the logs from the firewall. I've tried a handful of times to get them ingested into Kibana properly, but all I end up with is a bunch of blank visualizations on the Firewall dashboard because the logs aren't parsed properly (see attached).

Any help or nudges in the right direction would be greatly appreciated.

Thanks,
Eric
[Attachment: firewall_format.PNG]

Wes Lambert

Feb 4, 2019, 7:55:01 AM
to securit...@googlegroups.com
Have you tried looking for the firewall logs in Discover first, to see how they are being typed or interpreted, and what fields are present?

I would then compare those fields to the ones the visualizations are looking for (you can see this by editing the dashboard and clicking the visualization edit icon).

Thanks,
Wes

Eric Chiles

Feb 4, 2019, 9:26:59 AM
to security-onion

Thanks for the reply, Wes.
The way it's set up now, the firewall logs are being pulled in on 514 as syslog. I tried adding rules to Logstash to add tags and have the output index show as "logstash-firewall-*". What I was hoping for is for the "message" section to be pulled out into separate fields, rather than having all the information dumped into one field that I can't do much with. Maybe my approach was flawed from the beginning?

[Attachment: sonicwall.PNG]

Kevin Branch

Feb 5, 2019, 10:28:49 PM
to securit...@googlegroups.com
The Logstash kv plugin does very nicely for me with my SonicWALL sites.  (https://www.elastic.co/guide/en/logstash/current/plugins-filters-kv.html)
I use it on a dedicated Wazuh system separate from SO in these cases, but I'm sure you could fold it in with some tweaking.

Here is an adapted snippet from one of my Logstash filter sections involving SonicWALL syslog record parsing. Maybe you will find it useful.

    if ( [agent][name] == "sonicwall" ) {

        # Extract the already-labelled key-value pairs from the log message
        kv {
                source => "message"
                target => "data"
        }

        # SonicWALL src field looks like "192.168.135.154:51364:X3" with sometimes a DNS name as a 4th field
        csv {
                source => "[data][src]"
                separator => ":"
                columns => ["srcip","srcport","srciface","srcname"]
                target => "data"
        }

        # SonicWALL dst field looks like "192.168.135.154:51364:X3" with sometimes a DNS name as a 4th field
        csv {
                source => "[data][dst]"
                separator => ":"
                columns => ["dstip","dstport","dstiface","dstname"]
                target => "data"
        }

    }


Eric Chiles

Feb 7, 2019, 9:15:34 AM
to security-onion
Thanks for your input, Kevin. Hopefully I can tweak this to fit my scenario.

Much appreciated.
Eric

Eric Chiles

Feb 18, 2019, 11:22:25 AM
to security-onion
Just wanted to update you. I was able to use the kv snippet to massage the data into the firewall dashboard and get all the visualizations working. I appreciate your help with this!

Thanks so much.
Eric

Kevin Branch

Feb 18, 2019, 3:25:39 PM
to securit...@googlegroups.com
I am delighted to hear that was helpful.   Please share back with the community how you used that to get the firewall dashboard working with your SonicWALL events.

Kevin

Eric Chiles

Feb 19, 2019, 1:54:21 PM
to security-onion
I modified the premade 1004_preprocess_syslog_types.conf to tag my SonicWALL logs with "firewall":

user@HostName:/etc/logstash/conf.d$ sudo vim 1004_preprocess_syslog_types.conf
filter {
  if "syslog" in [tags] {
    # if [host] == "172.16.1.1"
    if [syslog-sourceip] == "x.x.x.x" {
      mutate {
        # add_field => { "type" => "fortinet" }
        add_tag => [ "firewall" ]
      }
    }
    if [syslog-sourceip] == "y.y.y.y" {
      mutate {
        add_tag => [ "firewall" ]
      }
    }
    if [host] == "10.0.0.101" {
      mutate {
        add_field => { "type" => "brocade" }
        add_tag => [ "switch" ]
      }
    }
    mutate {
      # add_tag => [ "conf_file_1004" ]
    }
  }
}
Then I added the kv filter you sent in the middle of the logstash confs at 5500.

user@HostName:/etc/logstash/conf.d$ sudo vim 5500_postprocess_sonicwall.conf
filter {
  if "firewall" in [tags] {
    # Extract key-value pairs from log message
    kv {
      source => "message"
      target => "data"
    }
    # Sonicwall src field
    csv {
      source => "[data][src]"
      separator => ":"
      columns => ["srcip","srcport","srciface","srcname"]
      target => "data"
    }
    # Sonicwall dst field
    csv {
      source => "[data][dst]"
      separator => ":"
      columns => ["dstip","dstport","dstiface","dstname"]
      target => "data"
    }
  }
  mutate {
    # add_tag => [ "conf_file_5500" ]
  }
}

In order for the data to work with the premade visualizations, some of the necessary fields needed to be renamed. I added this toward the latter part of logstash at 6800.

user@HostName:/etc/logstash/custom$ sudo vim 6800_post_sonic_rename.conf
filter {
  if "firewall" in [tags] {
    mutate {
      rename => { "[data][fw_action]" => "action" }
      rename => { "[data][srcip]" => "source_ip" }
      rename => { "[data][note]" => "reason" }
      rename => { "[data][dstip]" => "destination_ip" }
      rename => { "[data][proto]" => "ipv4_protocol" }
      rename => { "[data][dst]" => "dst" }
      rename => { "[data][dstMac]" => "dstMac" }
      rename => { "[data][msg]" => "msg" }
      rename => { "[data][srcMac]" => "srcMac" }
      rename => { "[data][dstport]" => "destination_port" }
      rename => { "[data][srcport]" => "source_port" }
    }
  }
  # mutate {
  #   add_tag => [ "conf_file_6800" ]
  # }
}
I'm sure it could be made a little prettier, and I did add the confs at somewhat arbitrary points where I thought it made sense to put them.

As a side note, I also modified 9034_output_syslog.conf to exclude anything that has the "firewall" tag, because those events were being written to both logstash-syslog-* and logstash-firewall-*, effectively doubling the output.
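
Roughly, the change boils down to wrapping the syslog output in a condition that skips firewall-tagged events. Just as a sketch (the actual contents of 9034_output_syslog.conf will differ, and the hosts/index values below are placeholders):

    output {
      # only send untagged syslog to the syslog index; firewall events
      # already go to logstash-firewall-* elsewhere
      if "syslog" in [tags] and "firewall" not in [tags] {
        elasticsearch {
          hosts => "elasticsearch"
          index => "logstash-syslog-%{+YYYY.MM.dd}"
        }
      }
    }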

[Attachment: firewall-vis.PNG]

Judd Brown

Jun 24, 2019, 10:23:23 AM
to security-onion
Hi Eric,

Thanks for this description. I see how it should work but have a couple of questions.

1) Are the /etc/logstash/conf.d mods made on the Master server, other, or both?
2) Do you download the KV plugin to that server?
3) Where is bin/logstash-plugin? I found one in each of /var/lib/docker/overlay2/<hash>/merged/usr/share/logstash and /var/lib/docker/overlay2/<hash>/diff/usr/share/logstash. Neither of those seems likely, but I could be wrong.

Thanks! Judd

Eric Chiles

Jun 24, 2019, 10:49:00 AM
to security-onion
Forgive me, it's been several months since I got this all figured out, so I don't recall exactly what steps went in which order.

I only have a single server, so distributing the logstash configuration isn't something that I have experience with.

I don't believe that I had to do any plugin modifications. KV may be part of Security Onion by default. Kevin or Wes might be able to provide more insight on that. I did copy this setup (with a change noted below) to a new Security Onion server and it's working without doing anything with plugins.
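
If you want to confirm whether the kv filter is available, you should be able to list the plugins from inside the Logstash container rather than hunting through the overlay2 directories. Something like this (assuming the container is named so-logstash; adjust to whatever docker ps shows on your box):

    sudo docker exec so-logstash /usr/share/logstash/bin/logstash-plugin list | grep kv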

As far as the Logstash config files go, it should be mentioned that changing the files in /etc/logstash/conf.d is not a best practice; I was early into my logstashing when I got this working. It's better to add files in the /etc/logstash/custom directory. You can usually fit them nicely into the sizable gaps in the numbering scheme in conf.d, and this will prevent future conf updates from overwriting modified files in conf.d. Restarting Logstash will create links to the custom files and slot them into the pipeline in the proper sequence. For example, 1004 was updated and cleared out my changes, so I created /etc/logstash/custom/1005 as a copy of the 1004 I posted above.
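
For example, something along these lines (file names are just illustrative, and adjust the last line to however you normally restart Logstash on your install):

    # copy the stock conf into custom under a free number, then edit the copy
    sudo cp /etc/logstash/conf.d/1004_preprocess_syslog_types.conf \
            /etc/logstash/custom/1005_preprocess_syslog_types.conf
    # restart Logstash so the custom file gets linked into the pipeline
    sudo so-logstash-restart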

Hope this helps.
Eric

Judd Brown

Jun 25, 2019, 11:02:09 AM
to security-onion
It does help. Thanks again!

Judd Brown

Jun 25, 2019, 11:04:15 AM
to security-onion

Kevin or Wes, can you add any insight here?

Thanks!

Francois

Jun 25, 2019, 12:08:31 PM
to security-onion
Personally, I'd like the SO project to include more Logstash configs for some of the most commonly seen firewall logs. Cisco is missing right now, so I'm going to work on creating the right config for that. I know that you already have pfsense in the distribution.

Can Eric's config be integrated in the project?

Thanks,

Francois

Doug Burks

Jul 2, 2019, 5:03:36 PM
to securit...@googlegroups.com
Hi Francois,

We only have access to pfsense at the moment and that's why we only support pfsense logs currently.  If folks are able to develop configs to parse other formats and submit via github, we'll gladly review any pull requests.

Thanks!


--
Doug Burks
CEO
Security Onion Solutions, LLC

Aida Roig-Compton

Jul 24, 2019, 1:42:49 PM
to security-onion
Hello Eric,
I see in your post that you've included the command "data.dstip". I'm working in Wazuh Kibana and I'm having trouble getting this to work. When followed by an IP address such as 123.45.6.789, what kind of results would you expect? What would be the purpose of such a search?
Thank you in advance!
Aida

Eric Chiles

Jul 24, 2019, 4:39:21 PM
to security-onion
The 5500 config file was made specifically for my Sonicwalls to take data out of a field in the syslog and restructure it. Something like this.

Log as generated:
message sn=############ time="2019-07-24 16:07:49" fw=XXX.XXX.XXX.XXX pri=1 c=32 m=83 msg="Probable port scan detected" n=178 src=XXX.XXX.XXX.XXX:443:X1:i0.wp.com dst=XXX.XXX.XXX.XXX:47175:X1 srcMac=XX:XX:XX:XX:XX:XX dstMac=XX:XX:XX:XX:XX:XX proto=tcp/https note="TCP scanned port list, 37898, 40033, 57830, 50545, 19152, 58599, 4505, 7096, 59979, 47175" fw_action="NA"

During processing, the src and dst key/value pairs are extracted and new fields are created, like this:
data.srcip=XXX.XXX.XXX.XXX data.srcport=443 data.srciface=X1 data.srcname=i0.wp.com

data.dstip=XXX.XXX.XXX.XXX data.dstport=47175 data.dstiface=X1, and it skips data.dstname because the destination is the device itself.

The 6800 config file renames some of these fields (and some of the standard fields from the original syslog). So rather than changing the Firewall dashboard visualization to look for data.srcip, I had logstash rename it to source_ip.

For Wazuh (or OSSEC, as the Dashboard shows), creating the data.whatever nested fields wouldn't be necessary, because the IP addresses shouldn't have port numbers and interfaces bundled with them in a single field.

Sorry for the long post. I hope it helps.

Eric

Yoshi Hiradate

Oct 28, 2019, 5:43:23 PM
to security-onion
Hello,

I know that this topic is months old, but I was wondering if you or anyone in this thread can give me some advice or tips on sending SonicWall logs to Kibana via AWS cloud services. I attempted to use Elasticsearch/Kibana, but the charts I get are not showing the information I would expect. I don't really know what I am doing since I am relatively new to this. I've done extensive research but am still struggling with this.

Can any of you give me some advice?

Thank you.

Eric Chiles

Oct 29, 2019, 9:44:53 AM
to security-onion
I had to rename some of the fields in the log in order for the visualizations to work properly. There were also a few of the visualizations that I had to modify to make sure they were pulling the right fields. For example, if the X-Axis bucket is looking for the field "action" I would have to change it to "action.keyword" instead.

I'm not at all familiar with running Kibana in AWS, so someone else will have to chime in on that part.

Eric