New install Logstash failing

This support forum board is for support questions relating to Nagios Log Server, our solution for managing and monitoring critical log data.
BanditBBS
Posts: 2474
Joined: Tue May 31, 2011 12:57 pm
Location: Scio, OH

New install Logstash failing

Post by BanditBBS »

Got a 2nd cluster spun up and, while getting ready to start sending logs, noticed that Logstash was down on one of the boxes. I restarted it, and within 10 minutes it was down again.
2 of XI5.6.14 Prod/DR/DEV - Nagios LogServer 2 Nodes
See my projects on the Exchange at BanditBBS - Also check out my Nagios stuff on my personal page at Bandit's Home and at github
mcapra
Posts: 3739
Joined: Thu May 05, 2016 3:54 pm

Re: New install Logstash failing

Post by mcapra »

Anything interesting in /var/log/logstash/logstash.log on the node that is failing?
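For quick triage on the failing node, something like this might help (paths are the ones from this thread; the grep pattern matches the `:level=>:error` format in the log entries pasted below):

```shell
# Last 50 lines of the Logstash log on the failing node
tail -n 50 /var/log/logstash/logstash.log

# Only the error-level entries, newest last
grep ':level=>:error' /var/log/logstash/logstash.log | tail -n 20
```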
Former Nagios employee
https://www.mcapra.com/

Re: New install Logstash failing

Post by BanditBBS »

Still looking, but saw this:

Code:

{:timestamp=>"2016-09-21T08:08:02.757000-0700", :message=>"Got error to send bulk of actions: None of the configured nodes are available: []", :level=>:error}
{:timestamp=>"2016-09-21T08:08:02.764000-0700", :message=>"Got error to send bulk of actions: None of the configured nodes are available: []", :level=>:error}
{:timestamp=>"2016-09-21T08:08:02.765000-0700", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: [], :backtrace=>["org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(org/elasticsearch/client/transport/TransportClientNodesService.java:279)", "org.elasticsearch.client.transport.TransportClientNodesService.execute(org/elasticsearch/client/transport/TransportClientNodesService.java:198)", "org.elasticsearch.client.transport.support.InternalTransportClient.execute(org/elasticsearch/client/transport/support/InternalTransportClient.java:106)", "org.elasticsearch.client.support.AbstractClient.bulk(org/elasticsearch/client/support/AbstractClient.java:163)", "org.elasticsearch.client.transport.TransportClient.bulk(org/elasticsearch/client/transport/TransportClient.java:356)", "org.elasticsearch.action.bulk.BulkRequestBuilder.doExecute(org/elasticsearch/action/bulk/BulkRequestBuilder.java:164)", "org.elasticsearch.action.ActionRequestBuilder.execute(org/elasticsearch/action/ActionRequestBuilder.java:91)", "org.elasticsearch.action.ActionRequestBuilder.execute(org/elasticsearch/action/ActionRequestBuilder.java:65)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:606)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch/protocol.rb:224)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch/protocol.rb:224)", "LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:466)", 
"LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:466)", "LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:465)", "LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:465)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:490)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:490)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:489)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:489)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:219)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:219)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1341)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:216)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:216)", 
"Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:193)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:193)", "Stud::Buffer.buffer_receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:159)", "Stud::Buffer.buffer_receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:159)", "LogStash::Outputs::ElasticSearch.receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:455)", "LogStash::Outputs::ElasticSearch.receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:455)", "LogStash::Outputs::Base.handle(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/outputs/base.rb:88)", "LogStash::Outputs::Base.handle(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/outputs/base.rb:88)", "RUBY.worker_setup(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/outputs/base.rb:79)", "java.lang.Thread.run(java/lang/Thread.java:745)"], :level=>:warn}
{:timestamp=>"2016-09-21T08:08:02.770000-0700", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>org.elasticsearch.client.transport.NoNodeAvailableException: None of the configured nodes are available: [], :backtrace=>["org.elasticsearch.client.transport.TransportClientNodesService.ensureNodesAreAvailable(org/elasticsearch/client/transport/TransportClientNodesService.java:279)", "org.elasticsearch.client.transport.TransportClientNodesService.execute(org/elasticsearch/client/transport/TransportClientNodesService.java:198)", "org.elasticsearch.client.transport.support.InternalTransportClient.execute(org/elasticsearch/client/transport/support/InternalTransportClient.java:106)", "org.elasticsearch.client.support.AbstractClient.bulk(org/elasticsearch/client/support/AbstractClient.java:163)", "org.elasticsearch.client.transport.TransportClient.bulk(org/elasticsearch/client/transport/TransportClient.java:356)", "org.elasticsearch.action.bulk.BulkRequestBuilder.doExecute(org/elasticsearch/action/bulk/BulkRequestBuilder.java:164)", "org.elasticsearch.action.ActionRequestBuilder.execute(org/elasticsearch/action/ActionRequestBuilder.java:91)", "org.elasticsearch.action.ActionRequestBuilder.execute(org/elasticsearch/action/ActionRequestBuilder.java:65)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:606)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch/protocol.rb:224)", "LogStash::Outputs::Elasticsearch::Protocols::NodeClient.bulk(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch/protocol.rb:224)", "LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:466)", 
"LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:466)", "LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:465)", "LogStash::Outputs::ElasticSearch.submit(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:465)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:490)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:490)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:489)", "LogStash::Outputs::ElasticSearch.flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:489)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:219)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:219)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1341)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:216)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:216)", 
"Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:193)", "Stud::Buffer.buffer_flush(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:193)", "Stud::Buffer.buffer_receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:159)", "Stud::Buffer.buffer_receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.19/lib/stud/buffer.rb:159)", "LogStash::Outputs::ElasticSearch.receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:455)", "LogStash::Outputs::ElasticSearch.receive(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-0.2.8-java/lib/logstash/outputs/elasticsearch.rb:455)", "LogStash::Outputs::Base.handle(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/outputs/base.rb:88)", "LogStash::Outputs::Base.handle(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/outputs/base.rb:88)", "RUBY.worker_setup(/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/outputs/base.rb:79)", "java.lang.Thread.run(java/lang/Thread.java:745)"], :level=>:warn}[root@hdfs-365-nag01 nagioslogserver]#

Re: New install Logstash failing

Post by mcapra »

Hmm, NoNodeAvailableException implies that Logstash isn't able to reach the Elasticsearch cluster properly.

I would also check the Elasticsearch logs in /var/log/elasticsearch for potential consensus issues. Last time I saw that particular exception, it was due to a master not being elected.
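A hedged sketch of that check, assuming Elasticsearch is listening on the default HTTP port 9200 (adjust if your install differs):

```shell
# Overall cluster state; a "status" of red, or a node count lower
# than expected, points at the election problems described above
curl -s 'http://localhost:9200/_cluster/health?pretty'

# Master-election chatter in the Elasticsearch logs
grep -i 'master' /var/log/elasticsearch/*.log | tail -n 20
```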

Re: New install Logstash failing

Post by BanditBBS »

Dangit, ignore that previous log paste. I was moving the data storage location, so I had Elasticsearch down for a few minutes. Other than the errors from that window, I see nothing else in the log.

Re: New install Logstash failing

Post by mcapra »

Try running logstash like so:

Code:

/usr/local/nagioslogserver/logstash/bin/logstash -f /usr/local/nagioslogserver/logstash/etc/conf.d --debug > /tmp/logstash_debug.log
Continue to monitor until it crashes; when it does, share the logstash_debug.log that gets created. It'll likely be an awfully long and poorly formatted file.
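The full sequence might look like this; the `service logstash` init-script name is an assumption, so adjust for however your Log Server nodes manage the service:

```shell
# Stop the managed service so only the debug instance is running
# (the service name here is an assumption)
service logstash stop

# Run Logstash in the foreground with debug output captured
# (2>&1 also catches anything written to stderr)
/usr/local/nagioslogserver/logstash/bin/logstash \
    -f /usr/local/nagioslogserver/logstash/etc/conf.d \
    --debug > /tmp/logstash_debug.log 2>&1

# After it dies, bring the service back and inspect the capture
service logstash start
less /tmp/logstash_debug.log
```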

Re: New install Logstash failing

Post by BanditBBS »

You mean stop the service and then run that command? Doing that, NLS will think Logstash isn't up; is that right?

Also, before I did that, I got this after a crash:

Code:

{:timestamp=>"2016-09-21T10:41:18.266000-0700", :message=>"Got error to send bulk of actions: Failed to deserialize exception response from stream", :level=>:error}
{:timestamp=>"2016-09-21T10:41:18.267000-0700", :message=>"Got error to send bulk of actions: Failed to deserialize exception response from stream", :level=>:error}
{:timestamp=>"2016-09-21T10:41:18.268000-0700", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>org.elasticsearch.transport.TransportSerializationException: Failed to deserialize exception response from stream, :backtrace=>["org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(org/elasticsearch/transport/netty/MessageChannelHandler.java:176)", "org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(org/elasticsearch/transport/netty/MessageChannelHandler.java:128)", "org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(org/elasticsearch/common/netty/channel/SimpleChannelUpstreamHandler.java:70)", "org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:564)", "org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:791)", "org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(org/elasticsearch/common/netty/channel/Channels.java:296)", "org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(org/elasticsearch/common/netty/handler/codec/frame/FrameDecoder.java:462)", "org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(org/elasticsearch/common/netty/handler/codec/frame/FrameDecoder.java:443)", "org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(org/elasticsearch/common/netty/handler/codec/frame/FrameDecoder.java:303)", "org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(org/elasticsearch/common/netty/channel/SimpleChannelUpstreamHandler.java:70)", "org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:564)", 
"org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:559)", "org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(org/elasticsearch/common/netty/channel/Channels.java:268)", "org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(org/elasticsearch/common/netty/channel/Channels.java:255)", "org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(org/elasticsearch/common/netty/channel/socket/nio/NioWorker.java:88)", "org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(org/elasticsearch/common/netty/channel/socket/nio/AbstractNioWorker.java:108)", "org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(org/elasticsearch/common/netty/channel/socket/nio/AbstractNioSelector.java:337)", "org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(org/elasticsearch/common/netty/channel/socket/nio/AbstractNioWorker.java:89)", "org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(org/elasticsearch/common/netty/channel/socket/nio/NioWorker.java:178)", "org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(org/elasticsearch/common/netty/util/ThreadRenamingRunnable.java:108)", "org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(org/elasticsearch/common/netty/util/internal/DeadLockProofWorker.java:42)", "java.util.concurrent.ThreadPoolExecutor.runWorker(java/util/concurrent/ThreadPoolExecutor.java:1145)", "java.util.concurrent.ThreadPoolExecutor$Worker.run(java/util/concurrent/ThreadPoolExecutor.java:615)", "java.lang.Thread.run(java/lang/Thread.java:745)"], :level=>:warn}
{:timestamp=>"2016-09-21T10:41:18.268000-0700", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>org.elasticsearch.transport.TransportSerializationException: Failed to deserialize exception response from stream, :backtrace=>["org.elasticsearch.transport.netty.MessageChannelHandler.handlerResponseError(org/elasticsearch/transport/netty/MessageChannelHandler.java:176)", "org.elasticsearch.transport.netty.MessageChannelHandler.messageReceived(org/elasticsearch/transport/netty/MessageChannelHandler.java:128)", "org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(org/elasticsearch/common/netty/channel/SimpleChannelUpstreamHandler.java:70)", "org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:564)", "org.elasticsearch.common.netty.channel.DefaultChannelPipeline$DefaultChannelHandlerContext.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:791)", "org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(org/elasticsearch/common/netty/channel/Channels.java:296)", "org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.unfoldAndFireMessageReceived(org/elasticsearch/common/netty/handler/codec/frame/FrameDecoder.java:462)", "org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.callDecode(org/elasticsearch/common/netty/handler/codec/frame/FrameDecoder.java:443)", "org.elasticsearch.common.netty.handler.codec.frame.FrameDecoder.messageReceived(org/elasticsearch/common/netty/handler/codec/frame/FrameDecoder.java:303)", "org.elasticsearch.common.netty.channel.SimpleChannelUpstreamHandler.handleUpstream(org/elasticsearch/common/netty/channel/SimpleChannelUpstreamHandler.java:70)", "org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:564)", 
"org.elasticsearch.common.netty.channel.DefaultChannelPipeline.sendUpstream(org/elasticsearch/common/netty/channel/DefaultChannelPipeline.java:559)", "org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(org/elasticsearch/common/netty/channel/Channels.java:268)", "org.elasticsearch.common.netty.channel.Channels.fireMessageReceived(org/elasticsearch/common/netty/channel/Channels.java:255)", "org.elasticsearch.common.netty.channel.socket.nio.NioWorker.read(org/elasticsearch/common/netty/channel/socket/nio/NioWorker.java:88)", "org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.process(org/elasticsearch/common/netty/channel/socket/nio/AbstractNioWorker.java:108)", "org.elasticsearch.common.netty.channel.socket.nio.AbstractNioSelector.run(org/elasticsearch/common/netty/channel/socket/nio/AbstractNioSelector.java:337)", "org.elasticsearch.common.netty.channel.socket.nio.AbstractNioWorker.run(org/elasticsearch/common/netty/channel/socket/nio/AbstractNioWorker.java:89)", "org.elasticsearch.common.netty.channel.socket.nio.NioWorker.run(org/elasticsearch/common/netty/channel/socket/nio/NioWorker.java:178)", "org.elasticsearch.common.netty.util.ThreadRenamingRunnable.run(org/elasticsearch/common/netty/util/ThreadRenamingRunnable.java:108)", "org.elasticsearch.common.netty.util.internal.DeadLockProofWorker$1.run(org/elasticsearch/common/netty/util/internal/DeadLockProofWorker.java:42)", "java.util.concurrent.ThreadPoolExecutor.runWorker(java/util/concurrent/ThreadPoolExecutor.java:1145)", "java.util.concurrent.ThreadPoolExecutor$Worker.run(java/util/concurrent/ThreadPoolExecutor.java:615)", "java.lang.Thread.run(java/lang/Thread.java:745)"], :level=>:warn}
{:timestamp=>"2016-09-21T11:27:13.192000-0700", :message=>"SIGTERM received. Shutting down the pipeline.", :level=>:warn}

Re: New install Logstash failing

Post by mcapra »

BanditBBS wrote: You mean stop the service and then run that command? Doing that, NLS thinks Logstash isn't up, is that right?
Yes to the first one, and "I believe so" on the second one.

TransportSerializationException is an interesting exception to catch. Before we start capturing big ugly debug output, can you share the java versions on each node? java -version should do the trick.
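For the record, `java -version` writes to stderr, so capturing just the version line looks like this (the ssh one-liner is a sketch; `other-node` is a placeholder hostname):

```shell
# First line of the Java version banner on this node
java -version 2>&1 | head -n 1

# Same check on the other node, without logging in interactively
# (replace other-node with the real hostname):
#   ssh root@other-node 'java -version 2>&1 | head -n 1'
```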

Re: New install Logstash failing

Post by BanditBBS »

mcapra wrote:
BanditBBS wrote: You mean stop the service and then run that command? Doing that, NLS thinks Logstash isn't up, is that right?
Yes to the first one, and "I believe so" on the second one.

TransportSerializationException is an interesting exception to catch. Before we start capturing big ugly debug output, can you share the java versions on each node? java -version should do the trick.
Here ya go....

Code:

[jclark@hdfs-365-nag01 ~]$ java -version
java version "1.7.0_111"
OpenJDK Runtime Environment (rhel-2.6.7.2.el6_8-x86_64 u111-b01)
OpenJDK 64-Bit Server VM (build 24.111-b01, mixed mode)
I have it capturing the big ugly log on the bad node.

Re: New install Logstash failing

Post by mcapra »

Cool, send it on over if/when it crashes. Can I also see the java -version output on the other node that is functional?