Adding a node, Elasticsearch and Logstash down in GUI

This support forum board is for support questions relating to Nagios Log Server, our solution for managing and monitoring critical log data.
polarbear1
Posts: 73
Joined: Mon Apr 13, 2015 4:26 pm

Re: Adding a node, Elasticsearch and Logstash down in GUI

Post by polarbear1 »

cat /var/log/elasticsearch/*.log
Node 1 looks like search queries from the Dashboard. Node 2 has no activity for today; the most recent chunk is here:

Code: Select all

[2015-07-14 13:54:48,797][INFO ][node                     ] [33ff6054-696c-48f0-8155-1917aff9d8d1] started
[2015-07-14 13:54:48,823][INFO ][gateway                  ] [33ff6054-696c-48f0-8155-1917aff9d8d1] recovered [0] indices into cluster_state
[2015-07-14 13:55:11,295][INFO ][node                     ] [33ff6054-696c-48f0-8155-1917aff9d8d1] stopping ...
[2015-07-14 13:55:11,324][INFO ][node                     ] [33ff6054-696c-48f0-8155-1917aff9d8d1] stopped
[2015-07-14 13:55:11,324][INFO ][node                     ] [33ff6054-696c-48f0-8155-1917aff9d8d1] closing ...
[2015-07-14 13:55:11,331][INFO ][node                     ] [33ff6054-696c-48f0-8155-1917aff9d8d1] closed
[2015-07-14 10:49:46,009][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] version[1.3.2], pid[5447], build[dee175d/2014-08-13T14:29:30Z]
[2015-07-14 10:49:46,014][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] initializing ...
[2015-07-14 10:49:46,033][INFO ][plugins                  ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] loaded [knapsack-1.3.2.0-d5501ef], sites []
[2015-07-14 10:49:52,472][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] initialized
[2015-07-14 10:49:52,472][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] starting ...
[2015-07-14 10:49:53,237][INFO ][transport                ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] bound_address {inet[/0:0:0:0:0:0:0:0:9300]}, publish_address {inet[/192.168.1.249:9300]}
[2015-07-14 10:49:53,341][INFO ][discovery                ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] e8945dd0-ae36-4699-a0fc-43811a9c38e1/2p8j0OlRQ8uXjuOsn-FAzA
[2015-07-14 10:49:56,440][INFO ][cluster.service          ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] new_master [4bee07f8-6f40-451a-a5bb-666e9a22b387][2p8j0OlRQ8uXjuOsn-FAzA][schpnag2][inet[/192.168.1.249:9300]]{max_local_storage_nodes=1}, reason: zen-disco-join (elected_as_master)
[2015-07-14 10:49:56,468][INFO ][http                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] bound_address {inet[/127.0.0.1:9200]}, publish_address {inet[localhost/127.0.0.1:9200]}
[2015-07-14 10:49:56,468][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] started
[2015-07-14 10:49:56,494][INFO ][gateway                  ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] recovered [0] indices into cluster_state
[2015-07-14 10:50:14,091][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] stopping ...
[2015-07-14 10:50:14,117][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] stopped
[2015-07-14 10:50:14,117][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] closing ...
[2015-07-14 10:50:14,126][INFO ][node                     ] [4bee07f8-6f40-451a-a5bb-666e9a22b387] closed
cat /var/log/logstash/logstash.log
Node 1 is not particularly interesting; mostly my own messages. On node 2 we have a repeating error...

Code: Select all

{:timestamp=>"2015-07-15T10:32:03.694000-0500", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>org.elasticsearch.client.transport.NoNodeAvailableException: No node available, :backtrace=>["org.elasticsearch.client.transport.TransportClientNodesService.execute(org/elasticsearch/client/transport/TransportClientNodesService.java:219)", "org.elasticsearch.client.transport.support.InternalTransportClient.execute(org/elasticsearch/client/transport/support/InternalTransportClient.java:106)", "org.elasticsearch.client.support.AbstractClient.bulk(org/elasticsearch/client/support/AbstractClient.java:147)", "org.elasticsearch.client.transport.TransportClient.bulk(org/elasticsearch/client/transport/TransportClient.java:360)", "org.elasticsearch.action.bulk.BulkRequestBuilder.doExecute(org/elasticsearch/action/bulk/BulkRequestBuilder.java:165)", "org.elasticsearch.action.ActionRequestBuilder.execute(org/elasticsearch/action/ActionRequestBuilder.java:85)", "org.elasticsearch.action.ActionRequestBuilder.execute(org/elasticsearch/action/ActionRequestBuilder.java:59)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:606)", "RUBY.bulk(/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:207)", "RUBY.flush(/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch.rb:315)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:219)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:219)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1339)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:216)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:216)", 
"Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:193)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:193)", "RUBY.buffer_receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:159)", "RUBY.receive(/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch.rb:311)", "RUBY.handle(/usr/local/nagioslogserver/logstash/lib/logstash/outputs/base.rb:86)", "RUBY.worker_setup(/usr/local/nagioslogserver/logstash/lib/logstash/outputs/base.rb:78)", "java.lang.Thread.run(java/lang/Thread.java:745)"], :level=>:warn}
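For what it's worth, that warning repeats on a timer, so a quick count tells you whether the flush failures are still piling up. A minimal sketch (the demo file below is an assumption so the snippet runs anywhere; on a real node you would point LOG at /var/log/logstash/logstash.log):

```shell
# Count "Failed to flush outgoing items" warnings in a Logstash log.
# LOG defaults to a demo file; on a real node use /var/log/logstash/logstash.log.
LOG="${LOG:-./logstash-demo.log}"

# Demo only: fabricate two sample entries if no log is present.
if [ ! -f "$LOG" ]; then
  printf '%s\n' \
    '{:timestamp=>"2015-07-15T10:32:03.694000-0500", :message=>"Failed to flush outgoing items", :level=>:warn}' \
    '{:timestamp=>"2015-07-15T10:32:05.101000-0500", :message=>"Failed to flush outgoing items", :level=>:warn}' \
    > "$LOG"
fi

# Print how many flush failures the log contains.
grep -c 'Failed to flush outgoing items' "$LOG"
```

If the count keeps climbing between runs, Logstash still cannot reach Elasticsearch on that node.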
tail -n20 /var/log/httpd/error_log
Nothing, at least for today. Yesterday on node 2 there was the NoNodeAvailableException I pointed out in the OP. Fairly certain that's one of the things we already fixed in the config.
tail -n20 /var/log/httpd/access_log
Nothing exciting on either node. Just standard User Agent strings from my workstation, which makes sense.
tail -f /usr/local/nagioslogserver/var/jobs.log
tail -f /usr/local/nagioslogserver/var/poller.log
Nothing exciting. Only exists on node 1. Does not exist on node 2.
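The two tails above boil down to a file-existence check you can run on each node. A quick sketch (paths as given in the thread; on a box without Log Server installed both will report missing):

```shell
# Check whether the Log Server job and poller logs exist on this node.
for f in /usr/local/nagioslogserver/var/jobs.log \
         /usr/local/nagioslogserver/var/poller.log; do
  if [ -f "$f" ]; then
    echo "present: $f"
  else
    echo "missing: $f"
  fi
done
```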
jolson
Attack Rabbit
Posts: 2560
Joined: Thu Feb 12, 2015 12:40 pm

Re: Adding a node, Elasticsearch and Logstash down in GUI

Post by jolson »

Nothing exciting. Only exists on node 1. Does not exist on node 2.
Now that's interesting. Those files should certainly exist. Is crond running on node 2? Any errors in the cron log?

Code: Select all

service crond status
cat /var/log/cron
Twits Blog
Show me a man who lives alone and has a perpetually clean kitchen, and 8 times out of 9 I'll show you a man with detestable spiritual qualities.
polarbear1
Posts: 73
Joined: Mon Apr 13, 2015 4:26 pm

Re: Adding a node, Elasticsearch and Logstash down in GUI

Post by polarbear1 »

DING DING DING! Winner!

Cron didn't run because /home/nagios was missing for some reason. It was probably an accidental delete on my part; I was messing with directories yesterday and may have fat-fingered something. After creating /home/nagios, changing the owner/group to the nagios user, and a quick reboot, we now have the jobs and poller log files. And the dashboard is now happy.
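For anyone hitting the same thing later, the fix amounts to recreating the home directory and handing it to the nagios user, then restarting. A rough sketch (the scratch path and the silenced chown are assumptions so it can be tried without root; on the affected node you would set NAGIOS_HOME=/home/nagios and run as root):

```shell
# Recreate the nagios home directory and hand it to the nagios user.
# NAGIOS_HOME defaults to a scratch path so this demo needs no privileges;
# on the real node: NAGIOS_HOME=/home/nagios, run as root.
NAGIOS_HOME="${NAGIOS_HOME:-./nagios-home-demo}"

mkdir -p "$NAGIOS_HOME"

# chown needs root and an existing nagios user, so don't let a demo run die on it.
chown nagios:nagios "$NAGIOS_HOME" 2>/dev/null || true

# Show the result (owner/group/permissions of the directory).
ls -ld "$NAGIOS_HOME"
```

Followed by a service restart or a reboot, as in the post above, so crond picks the directory up.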

Case Closed. Thanks. :D
jolson
Attack Rabbit
Posts: 2560
Joined: Thu Feb 12, 2015 12:40 pm

Re: Adding a node, Elasticsearch and Logstash down in GUI

Post by jolson »

polarbear1,

:D I'm glad we got this resolved. Of course if you have any further issues feel free to make further threads!

Jesse