
Logstash keeps crashing randomly

Posted: Sun Oct 06, 2019 9:09 pm
by saber
Hi,

Logstash keeps crashing roughly every two weeks, sometimes more often, sometimes less.

We have already increased "LS_HEAP_SIZE" to a large value. The process simply disappears and nothing is logged.

Any idea?

Thank you,
Saber

Re: Logstash keeps crashing randomly

Posted: Mon Oct 07, 2019 9:47 am
by mbellerue
What version of Log Server is this? Also, what are LS_HEAP_SIZE and LS_OPEN_FILES currently set to? How much memory does your system have?

Re: Logstash keeps crashing randomly

Posted: Sat Oct 26, 2019 6:49 pm
by saber
It's always related to "LS_HEAP_SIZE". We increased it to 64GB, but we are still seeing the following:

Code: Select all

Oct 24 17:06:15  logstash: Error: Your application used more memory than the safety cap of 65536M.
Oct 24 17:06:15  logstash: Specify -J-Xmx####m to increase it (#### = cap size in MB).
Oct 24 17:06:15  logstash: Specify -w for full OutOfMemoryError stack trace
There must be a leak somewhere.

Re: Logstash keeps crashing randomly

Posted: Mon Oct 28, 2019 9:23 am
by scottwilkerson
What is the output of this command?

Code: Select all

grep HEAP /etc/sysconfig/logstash

Re: Logstash keeps crashing randomly

Posted: Mon Oct 28, 2019 11:16 am
by saber

Code: Select all

#LS_HEAP_SIZE="256m"
It's increased to 65536m in "/etc/init.d/logstash".

Re: Logstash keeps crashing randomly

Posted: Mon Oct 28, 2019 11:18 am
by scottwilkerson
saber wrote:

Code: Select all

#LS_HEAP_SIZE="256m"
It's increased to 65536m in "/etc/init.d/logstash".
That is definitely a problem: you have it set to a value equal to the entire system's memory. It needs to be MUCH lower; I would think about 2048m would be a good value to set it to.

Elasticsearch is going to use 32G, and you also need a good amount of memory left free for the filesystem cache so Elasticsearch can function properly.
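If it helps, the change above can be sketched as a one-liner. The snippet below runs against a scratch copy rather than the live file; the path /etc/sysconfig/logstash and the 2048m value come from this thread, and the sed syntax assumes GNU sed as shipped on CentOS:

```shell
# Demonstrate the edit on a temporary copy rather than the live
# /etc/sysconfig/logstash (swap in the real path when applying it).
cfg=$(mktemp)
printf '#LS_HEAP_SIZE="256m"\n' > "$cfg"   # the commented-out default

# Uncomment the setting and lower it to the suggested 2048m.
sed -i 's/^#\{0,1\}LS_HEAP_SIZE=.*/LS_HEAP_SIZE="2048m"/' "$cfg"

grep HEAP "$cfg"    # prints: LS_HEAP_SIZE="2048m"
rm -f "$cfg"
```

Since the thread shows the value was raised directly in /etc/init.d/logstash, that edit should be reverted as well, so the sysconfig value is the one that takes effect.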

Re: Logstash keeps crashing randomly

Posted: Tue Oct 29, 2019 1:28 pm
by saber
We have 256GB of RAM, of which 128GB is assigned to Elasticsearch (50%) and 64GB to Logstash.

Re: Logstash keeps crashing randomly

Posted: Tue Oct 29, 2019 2:50 pm
by cdienger
64GB is the max we recommend for a system, since Elasticsearch automatically takes half, and above a 32GB heap Java will see performance issues with memory addressing. I'd recommend setting the heap size manually by editing /etc/sysconfig/elasticsearch and commenting out this line:

Code: Select all

#ES_HEAP_SIZE=$(expr $(free -m|awk '/^Mem:/{print $2}') / 2 )m
and replacing it with something like:

Code: Select all

ES_HEAP_SIZE=3100m
I would also lower LS_HEAP_SIZE to something lower than 32GB.

Is this a Cent/RHEL or Ubuntu/Debian install?
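The sizing rule above ("half of RAM, but never a heap above 32GB") can be sketched as a small helper. The function name and the 31744m (31GB) cap are illustrative assumptions for a safe margin under the 32GB limit, not values from any Nagios tooling:

```shell
# Sketch of the heap-sizing rule discussed above: half of total RAM,
# capped just under 32GB so Java keeps efficient memory addressing.
# The 31744 (31GB) cap is an assumed safe margin, not an official value.
safe_es_heap_mb() {
    total_mb=$1
    cap_mb=31744
    half_mb=$(( total_mb / 2 ))
    echo $(( half_mb < cap_mb ? half_mb : cap_mb ))
}

safe_es_heap_mb 8192      # 8GB box  -> prints 4096
safe_es_heap_mb 262144    # 256GB box, as in this thread -> prints 31744
```

On the 256GB machine in this thread, that rule would yield roughly ES_HEAP_SIZE=31744m instead of the 128GB the default expr line computes.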

Re: Logstash keeps crashing randomly

Posted: Thu Oct 31, 2019 2:45 pm
by saber
Hi,

Total server memory is 256GB. If Elastic Search takes 50%, it's 128GB.

We have 128GB remaining. We assigned 64GB to Logstash because of those errors, but it hasn't helped.

It's a CentOS 7 install. We use syslog over TLS.

I have attached our memory usage over 30 days. Clearly, there are no OOMs at all. It's always a Logstash failure for unknown reasons.

Re: Logstash keeps crashing randomly

Posted: Thu Oct 31, 2019 3:19 pm
by cdienger
128GB is four times what is recommended for Elasticsearch. Anything above 32GB can cause performance problems:

https://www.elastic.co/blog/a-heap-of-trouble

I would recommend lowering the memory of Logstash as well for the same reason. Ordinarily it shouldn't take anywhere near as much memory as it is supposedly using. It may be a configuration issue; please PM me a profile from the machine after lowering the memory values for both services and restarting them.

A profile can be gathered under Admin > System > System Status > Download System Profile or from the command line with:

/usr/local/nagioslogserver/scripts/profile.sh

This will create /tmp/system-profile.tar.gz.

Note that this file can be very large and may not upload through a PM, usually because of the logs in the Logstash and/or Elasticsearch directories inside it. If it is too large, please open the profile, extract those directories/files, and send them separately.