Logstash not working with SG - no permissions for indices:data/write/bulk


Ross Heilman

Nov 10, 2016, 8:45:06 AM
to Search Guard
Hi - does anyone know what could be going wrong here? I assigned my logstash user the sg_all_access role, but in my Logstash logs I am getting: no permissions for indices:data/write/bulk
Here is my sg_all_access config, which is working fine for my kibana user:
sg_all_access:
  cluster:
    - CLUSTER_ALL
  indices:
    '*':
      '*':
        - ALL
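For reference, the user-to-role assignment itself is done in sg_roles_mapping.yml; ours looks roughly like the following (retyped from memory, so treat it as approximate - the kibana entry is just for illustration):

sg_all_access:
  users:
    - logstash
    - kibana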

Here is my Logstash output config:
output {
  elasticsearch {
    user => "logstash"
    password => "xxxxxxxxxxxx"
    ssl => true
    ssl_certificate_verification => true
    truststore => "/etc/elasticsearch/truststore.jks"
    truststore_password => 'xxxxxxxxxxx'
    index => "qa-logs-%{+YYYY.MM.dd}"
  }
}


And here is the error I am getting:

{:timestamp=>"2016-11-10T06:40:46.961000+0000", :message=>"[403] {\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"}],\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"},\"status\":403}", :class=>"Elasticsearch::Transport::Transport::Errors::Forbidden", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:201:in `__raise_transport_error'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:312:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/client.rb:128:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.17/lib/elasticsearch/api/actions/bulk.rb:88:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'", "org/jruby/RubyArray.java:1653:in `each_slice'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:114:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:232:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:201:in `start_workers'"], :level=>:warn}

Any ideas? - Thank you! 

SG

Nov 10, 2016, 10:19:47 AM
to search...@googlegroups.com
What's your Search Guard version?

Ross Heilman

Nov 10, 2016, 3:07:47 PM
to Search Guard


 

Yes, thank you.

Here is the version we installed:

/usr/share/elasticsearch/bin/plugin install -b com.floragunn/search-guard-2/2.3.3.7



Here is another error I am seeing after we changed the permissions to:

sg_logstash:
  cluster:
    - indices:admin/template/get
    - indices:admin/template/put
  indices:
    '*':
      '*':
        - indices:data/write/bulk
        - indices:data/write/bulk\[s\]
        - indices:data/write/delete
        - indices:data/write/update
        - indices:data/read/search
        - indices:data/read/scroll
        - CREATE_INDEX

 

We are now seeing the error below, and the odd thing is the user=>nil, password=>nil at the end of it - as if the credentials are not getting passed. Any thoughts?

{:timestamp=>"2016-11-10T19:06:58.506000+0000", :message=>"Attempted to send a bulk request to Elasticsearch configured at '[\"https://es-m-qa-a.sapphirepri.com:9200/\"]', but an error occurred and it failed! Are you sure you can reach elasticsearch from this machine using the configuration provided?", :error_message=>"[403] {\"error\":{\"root_cause\":[{\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"}],\"type\":\"security_exception\",\"reason\":\"no permissions for indices:data/write/bulk\"},\"status\":403}", :error_class=>"Elasticsearch::Transport::Transport::Errors::Forbidden", :backtrace=>["/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:201:in `__raise_transport_error'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/base.rb:312:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/transport/http/manticore.rb:67:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-transport-1.0.17/lib/elasticsearch/transport/client.rb:128:in `perform_request'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.0.17/lib/elasticsearch/api/actions/bulk.rb:88:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'", "org/jruby/RubyArray.java:1653:in `each_slice'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.0-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/output_delegator.rb:114:in `multi_receive'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:301:in `output_batch'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:232:in `worker_loop'", "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-2.3.3-java/lib/logstash/pipeline.rb:201:in `start_workers'"], :client_config=>{:hosts=>["https://es-m-qa-a.sapphirepri.com:9200/"], :ssl=>{:enabled=>true, :truststore_password=>"63GpQWZ8wVvq", :truststore=>"/etc/elasticsearch/truststore.jks"}, :transport_options=>{:socket_timeout=>0, :request_timeout=>0, :proxy=>nil, :ssl=>{:enabled=>true, :truststore_password=>"63GpQWZ8wVvq", :truststore=>"/etc/elasticsearch/truststore.jks"}}, :transport_class=>Elasticsearch::Transport::Transport::HTTP::Manticore, :headers=>{"Authorization"=>"Basic bG9nc3Rhc2g6c21keTN6NTluNWhr"}, :logger=>nil, :tracer=>nil, :reload_connections=>false, :retry_on_failure=>false, :reload_on_failure=>false, :randomize_hosts=>false, :http=>{:scheme=>"https", :user=>nil, :password=>nil, :port=>9200}}, :level=>:error}

Ross Heilman

Nov 10, 2016, 3:14:10 PM
to Search Guard
So we narrowed the issue down to one specific setting - the index name we use in our Logstash config (and we need it to be this): index => "qa-logs-%{loggerName}-%{+YYYY.MM.dd}"
The problem is that Search Guard gives us the permission errors when that is our index name. If we change it to something like index => "qa-logs-%{+YYYY.MM.dd}", Logstash with Search Guard works fine.
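One workaround we are considering (an untested sketch, using our loggerName field) is to make sure the field always exists before the output runs, so the sprintf reference can never end up unexpanded in the index name:

filter {
  # if the event has no loggerName field, give it a default value
  if ![loggerName] {
    mutate {
      add_field => { "loggerName" => "unknown" }
    }
  }
}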
Really stuck now - any help would be greatly appreciated!

Thanks!
-Ross

Jochen Kressin

Nov 10, 2016, 3:34:42 PM
to Search Guard
Hi Ross,

thanks for reporting this, we're trying to reproduce it. Could you please help us by sending the Elasticsearch logfiles on DEBUG level when the error occurs?

Add this line to the logging.yml file in the config directory:

com.floragunn: DEBUG

The output will be verbose, but it helps us pinpoint the problem. If possible, just send the complete logfile from startup to the point where the "no perm match" error occurs.
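In a stock ES 2.x logging.yml the line goes under the existing logger: section, which would then look roughly like this (a sketch, assuming the default layout):

logger:
  # log action execution errors for easier debugging
  action: DEBUG
  # Search Guard debug output
  com.floragunn: DEBUG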

Thanks!

Jochen
CTO
floragunn GmbH

Jochen Kressin

Nov 11, 2016, 3:58:17 AM
to Search Guard
Also, are you sure that this part

%{loggerName}


gets expanded correctly? Because if it does, and the outcome is a valid index name, and the logstash user is in the all_access group, which has all permissions for all indices, then there's no reason it should not work.
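A quick way to verify the expansion is to temporarily add a stdout output next to the elasticsearch output and inspect the events, for example (just a sketch):

output {
  # print each event so you can see whether loggerName is actually set
  stdout { codec => rubydebug }
}

If loggerName is missing from an event, Logstash leaves the sprintf reference as the literal text %{loggerName} in the index name.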


Anyway, we need to have a look at the ES debug logs to see what's going on.

Ross Heilman

Nov 11, 2016, 9:30:49 AM
to search...@googlegroups.com
Thank you! Yes, that is the weird thing: it seems to get expanded correctly when SG is not enabled, but not when it is enabled. I will post another error below that shows %{type} not getting expanded - the unexpanded %{type} in the index name seems to be what triggers the pattern_syntax_exception. I'm not sure Search Guard has anything to do with it, but it is a very odd issue. I had to roll back the change for the moment; I need to rebuild it in a test environment and reproduce it, and once I do, I will send you the debug logs.

{:timestamp=>"2016-11-10T15:20:57.470000+0000", :message=>"[400] {\"error\":{\"root_cause\":[{\"type\":\"pattern_syntax_exception\",\"reason\":\"Illegal repetition near index 8\\nqa-logs-%{type}_2016\\\\.11\\\\.08.*\\n        ^\"}],\"type\":\"pattern_syntax_exception\",\"reason\":\"Illegal repetition near index 8\\nqa-logs-%{type}_2016\\\\.11\\\\.08.*\\n        ^\"},\"status\":400}",

Thank you, and I'll get you more information as soon as I can,

- Ross
