Remote Filebeat to Security Onion Logstash problems


ryan reid

Apr 6, 2020, 5:43:02 PM
to security-onion
So I'm getting the errors below even though my Filebeat instance says the config is OK and it can communicate with the remote server. But for some reason Filebeat won't start.

ryan@ryan-OptiPlex-990:/etc/filebeat/modules.d$ sudo filebeat test config
Exiting: error initializing publisher: error initializing processors: the processor add_host_metadata doesn't exist
ryan@ryan-OptiPlex-990:/etc/filebeat/modules.d$ cd ..
ryan@ryan-OptiPlex-990:/etc/filebeat$ ls
data        filebeat.reference.yml  LICENSE.txt  modules.d
fields.yml  filebeat.yml            logs         NOTICE.txt
filebeat    kibana                  module       README.md
ryan@ryan-OptiPlex-990:/etc/filebeat$ sudo vi filebeat.yml
ryan@ryan-OptiPlex-990:/etc/filebeat$ sudo filebeat test config
Config OK
ryan@ryan-OptiPlex-990:/etc/filebeat$ sudo filebeat test output
logstash: 10.0.0.71:5044...
  connection...
    parse host... OK
    dns lookup... OK
    addresses: 10.0.0.71
    dial up... OK
  TLS... WARN secure connection disabled
  talk to server... OK
ryan@ryan-OptiPlex-990:/etc/filebeat$ sudo filebeat run
Exiting: Registry file path must be a file. /var/lib/filebeat/registry is a directory.

Some early guidance would be nice; I've been through the manuals but I seem to be missing something.

ryan reid

Apr 7, 2020, 8:14:01 AM
to security-onion
Okay, so I did more digging but still can't figure out where to update the registry path; it isn't in the yml file. My registry file is in data since I installed from a tar.gz, but Filebeat is looking for it elsewhere, in /var/lib/filebeat. Where do I change this configuration?
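
For reference, on a Filebeat 6.x tarball install the registry location can be overridden with top-level settings in filebeat.yml. A sketch (key names per Filebeat 6.x; in 7.x these were replaced by `filebeat.registry.path`):

```yaml
# Sketch for a Filebeat 6.x tarball install: point the registry at the
# extracted data directory instead of the package default /var/lib/filebeat.
path.data: /etc/filebeat/data
filebeat.registry_file: ${path.data}/registry
filebeat.registry_file_permissions: 0600
```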

Michael Givens

Apr 7, 2020, 8:53:58 AM
to security-onion
Ryan,

Can I ask why you chose Filebeat over Winlogbeat?  Are you running web servers or something of that nature (to gather flat log files) on your Windows boxes?

Or are you just wanting to get Windows event logs off of the Windows boxes?

Thanks - Mike

ryan reid

Apr 7, 2020, 9:08:53 AM
to security-onion
Mike,

I chose Filebeat because I'm not running Windows. I have Filebeat installed on an Ubuntu 18.04 machine, and my ELK stack is running on a Security Onion VM.

ryan reid

Apr 7, 2020, 9:12:49 AM
to security-onion
Mike,

Right now I am trying to figure out how to get the processors to work, what needs to be done to call them properly in the filebeat.yml file, and where the registry setting belongs in the yml file, so I don't keep getting the wrong registry path.

I've tried adding the following to my filebeat.yml config file. Nothing has seemed to work.

path.data: /etc/filebeat/data
filebeat.registry_file: ${path.data}/registry
filebeat.registry_file_permissions: 0600



Wes Lambert

Apr 7, 2020, 11:21:55 AM
to securit...@googlegroups.com
Please share the entire Filebeat config file, redacting as necessary.  Additionally, what version of FB are you using?


ryan reid

Apr 7, 2020, 2:28:22 PM
to securit...@googlegroups.com
Thanks for the quick response, I've been trying like crazy to get this to work.

root@ryan-OptiPlex-990:/etc/filebeat# filebeat version
filebeat version 6.0.0 (amd64), libbeat 6.0.0


##################### Filebeat Configuration Example #########################

# This file is an example configuration file highlighting only the most common
# options. The filebeat.reference.yml file from the same directory contains all the
# supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/filebeat/index.html

# For more available modules and options, please see the filebeat.reference.yml sample
# configuration file.


path.data: /etc/filebeat/data
filebeat.registry_file: ${path.data}/registry
filebeat.registry_file_permissions: 0600

#=========================== Filebeat inputs =============================

filebeat.inputs:

# Each - is an input. Most options can be set at the input level, so
# you can use different inputs for various configurations.
# Below are the input specific configurations.

- type: log

  # Change to true to enable this input configuration.
  enabled: false

  # Paths that should be crawled and fetched. Glob based paths.
  paths:
    - /var/log/*.log
    #- c:\programdata\elasticsearch\logs\*

  # Exclude lines. A list of regular expressions to match. It drops the lines that are
  # matching any regular expression from the list.
  #exclude_lines: ['^DBG']

  # Include lines. A list of regular expressions to match. It exports the lines that are
  # matching any regular expression from the list.
  #include_lines: ['^ERR', '^WARN']

  # Exclude files. A list of regular expressions to match. Filebeat drops the files that
  # are matching any regular expression from the list. By default, no files are dropped.
  #exclude_files: ['.gz$']

  # Optional additional fields. These fields can be freely picked
  # to add additional information to the crawled log files for filtering
  #fields:
  #  level: debug
  #  review: 1

  ### Multiline options

  # Multiline can be used for log messages spanning multiple lines. This is common
  # for Java Stack Traces or C-Line Continuation

  # The regexp Pattern that has to be matched. The example pattern matches all lines starting with [
  #multiline.pattern: ^\[

  # Defines if the pattern set under pattern should be negated or not. Default is false.
#multiline.negate: false

  # Match can be set to "after" or "before". It is used to define if lines should be append to a pattern
  # that was (not) matched before or after or as long as a pattern is not matched based on negate.
  # Note: After is the equivalent to previous and before is the equivalent to to next in Logstash
  #multiline.match: after


#============================= Filebeat modules ===============================

filebeat.config.modules:
  # Glob pattern for configuration loading
  path: ${path.config}/modules.d/*.yml

  # Set to true to enable config reloading
  reload.enabled: false

  # Period on which files under path should be checked for changes
  #reload.period: 10s

#==================== Elasticsearch template setting ==========================

setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false

#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging


#============================== Dashboards =====================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here, or by using the `-setup` CLI flag or the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:

#============================== Kibana =====================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.

setup.kibana:
  host: "http://10.0.0.71:5601/app/kibana"

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify and additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

#============================= Elastic Cloud ==================================

# These settings simplify using filebeat with the Elastic Cloud (https://cloud.elastic.co/).

# The cloud.id setting overwrites the `output.elasticsearch.hosts` and
# `setup.kibana.host` options.
# You can find the `cloud.id` in the Elastic Cloud web UI.
#cloud.id:

# The cloud.auth setting overwrites the `output.elasticsearch.username` and
# `output.elasticsearch.password` settings. The format is `<user>:<pass>`.
#cloud.auth:

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Enabled ilm (beta) to use index lifecycle management instead daily indices.
  #ilm.enabled: false

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["10.0.0.71:5044"]
  tls:
    ssl.certificate_authorities: ["/etc/ssl/certs/securityonion.pem"]
  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

#================================ Processors =====================================

# Configure processors to enhance or manipulate events generated by the beat.

    # processors:
    # - add_host_metadata:
    netinfo.enabled: true

    # - add_cloud_metadata: ~

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

#============================== Xpack Monitoring ===============================
# filebeat can export internal metrics to a central Elasticsearch monitoring
# cluster.  This requires xpack monitoring to be enabled in Elasticsearch.  The
# reporting is disabled by default.

# Set to true to enable the monitoring reporter.
#xpack.monitoring.enabled: false

# Uncomment to send the metrics to Elasticsearch. Most settings from the
# Elasticsearch output are accepted here as well. Any setting that is not set is
# automatically inherited from the Elasticsearch output configuration, so if you
# have the Elasticsearch output configured, you can simply uncomment the
# following line.
#xpack.monitoring.elasticsearch:
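
As a point of comparison, the processors section normally sits at the top level of filebeat.yml, not indented under an output. A sketch based on the stock 6.x config (note that `add_host_metadata` was added in a later 6.x release — 6.3, per the Beats changelog — which would explain the "doesn't exist" error on 6.0.0):

```yaml
# Sketch: top-level processors block as shipped in the stock filebeat.yml.
# add_host_metadata is unknown to Filebeat 6.0.0, matching the
# "the processor add_host_metadata doesn't exist" error earlier in the thread.
processors:
  - add_host_metadata:
      netinfo.enabled: true
```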


Best Regards,

Ryan Reid


Wes Lambert

Apr 7, 2020, 4:32:27 PM
to securit...@googlegroups.com
First thing -- change the version of Filebeat so it aligns with the current version of Elastic you are running (the version you have is too old and may cause issues):

You can get the version on your system by running the following:

curl -s localhost:9200

(assuming it will be something like 6.8.6)
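
The root endpoint returns a small JSON document; for illustration, here is how you'd pull the version out of a sample response (the response body below is a hypothetical sample, not output from Ryan's node):

```python
import json

def es_version(body: str) -> str:
    """Extract version.number from an Elasticsearch root-endpoint response."""
    return json.loads(body)["version"]["number"]

# Hypothetical sample of the JSON `curl -s localhost:9200` returns:
sample = '{"name": "so-node", "cluster_name": "elasticsearch", "version": {"number": "6.8.6"}}'
print(es_version(sample))  # prints 6.8.6
```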

Second, disable TLS for now, until you verify you can get the current stuff working -- there is additional config needed on the Logstash side for this to work as intended, and it may not be there or configured correctly.
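
A minimal non-TLS Logstash output would look something like this (a sketch; note that in Filebeat 5.x and later the option block is `ssl` — the older `tls` key used in the config above was removed):

```yaml
# Sketch: plain (non-TLS) Logstash output for initial testing.
# Once Logstash is configured for TLS, an ssl section would go here;
# the "tls:" block from earlier Beats versions is not valid in 6.x.
output.logstash:
  hosts: ["10.0.0.71:5044"]
  # ssl.certificate_authorities: ["/etc/ssl/certs/securityonion.pem"]
```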

Last, with regard to the registry issue, make sure there is not an empty folder in the way preventing the registry file from being written.

Let me know if that helps.

Thanks,
Wes

ryan reid

Apr 7, 2020, 7:24:33 PM
to securit...@googlegroups.com
Wes, 

Thanks for the tips, I'll definitely try them. As far as the registry issue goes, I know where the registry file is, and Filebeat is looking for it in the totally wrong location.

When you do filebeat run, it looks for it in /var/lib/filebeat/registry, but that's a directory, not a file. Since Filebeat was installed via a tar.gz archive, the registry file is in /etc/filebeat/data, so I was looking for how to change where it looks.


Best Regards,

Ryan Reid


ryan reid

Apr 8, 2020, 8:23:40 AM
to securit...@googlegroups.com
Morning Wes,

I was wiped out after a while yesterday. So I did the following this morning, and this is what I got in the log. When I'm on the SO VM and I try to sniff the packets, I don't see anything coming in, but it seems to say that the connection has been established?

2020-04-08T08:12:08.195-0400    INFO    log/harvester.go:255    Harvester started for file: /var/log/auth.log
2020-04-08T08:12:09.197-0400    ERROR   logstash/async.go:256   Failed to publish events caused by: write tcp 10.0.0.128:48406->10.0.0.71:5044: write: connection reset by peer
2020-04-08T08:12:10.838-0400    ERROR   pipeline/output.go:121  Failed to publish events: write tcp 10.0.0.128:48406->10.0.0.71:5044: write: connection reset by peer
2020-04-08T08:12:10.838-0400    INFO    pipeline/output.go:95   Connecting to backoff(async(tcp://10.0.0.71:5044))
2020-04-08T08:12:10.838-0400    INFO    pipeline/output.go:105  Connection to backoff(async(tcp://10.0.0.71:5044)) established
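
One way to double-check the raw TCP path independently of Filebeat is a quick socket test from the shipping host (a generic sketch; the host and port below match the thread's Logstash endpoint). If this returns True but Filebeat still logs "connection reset by peer", the reset is coming from the Logstash side — for example, a Logstash input expecting TLS while Filebeat sends plaintext:

```python
import socket

def can_reach(host: str, port: int, timeout: float = 3.0) -> bool:
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# From the Filebeat host, against the Security Onion Logstash port:
# can_reach("10.0.0.71", 5044)
```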


Best Regards,

Ryan Reid



Wes Lambert

Apr 8, 2020, 8:28:03 AM
to securit...@googlegroups.com
To be clear, what steps have you taken since my last response?

On the Security Onion box, what mode did you use when running setup?

What is the output of the following from the node?

grep MINIMAL /etc/nsm/securityonion.conf

ryan reid

Apr 8, 2020, 8:33:26 AM
to securit...@googlegroups.com
My SO box is running in production mode, and I have run so-allow for all ports required for Elastic and Logstash. Since your last direction, I have done the following:

Verified the ELK stack on my SO box is 6.8.6
Upgraded Filebeat on my Ubuntu box to 6.8.6
Commented out the TLS and processors settings
Filebeat then ran on the Ubuntu box without the registry error; the log says it couldn't find a registry, so it created a new one.
The output I gave you was from the Ubuntu box running Filebeat remotely from the SO instance.


Best Regards,

Ryan Reid


Wes Lambert

Apr 8, 2020, 9:18:49 AM
to securit...@googlegroups.com
What is the output of the following from the node?

grep MINIMAL /etc/nsm/securityonion.conf


ryan reid

Apr 8, 2020, 12:01:03 PM
to securit...@googlegroups.com
There was no output from running that on the Security Onion node. However, on my Ubuntu box the netstat output shows:

Proto Recv-Q Send-Q Local Address           Foreign Address         State      
tcp        0      0 ryan-OptiPlex-990:48472 10.0.0.71:5044          ESTABLISHED

The .71 address is the Security Onion IP, and it says it's connected to Logstash.


Best Regards,

Ryan Reid

