
Re: LogServer's memory be exhausted

Posted: Thu May 12, 2016 9:18 am
by eloyd
Maybe not. Remember - the quest is not for lots of free memory, it's for lots of memory being used efficiently. The JVMs may have a lot of memory (or CPU) allocated because they're doing useful work. It's like buying a 3000 square foot house then only using 1000 square feet and saving the other 2000 square feet "in case you need it." Either buy a 1000 square foot house and use it all or buy a 3000 square foot house and use it all. But in both cases, use it efficiently.

Re: LogServer's memory be exhausted

Posted: Thu May 12, 2016 9:54 am
by hsmith
Let us know what happens.

Re: LogServer's memory be exhausted

Posted: Thu May 19, 2016 4:34 am
by bennspectrum
Hello all,

Following up on this thread, I want to ask about the settings on the Maintain & Backup page.

If I want to keep the indexes for 60 days or more, is there any other configuration I should set up?

If I have a machine with an 8-core CPU and 64 GB of memory, what load can NLS handle? How much data volume per day: 20 GB or more?

In other words, what is the limit NLS can tolerate? Do you have any suggestions or reference numbers?

Thanks.

Re: LogServer's memory be exhausted

Posted: Thu May 19, 2016 1:49 pm
by hsmith
bennspectrum wrote:If I want to keep the indexes for 60 days or more, is there any other configuration I should set up?
Open, or on the server?
bennspectrum wrote:If I have a machine with an 8-core CPU and 64 GB of memory, what load can NLS handle? How much data volume per day: 20 GB or more?
I've seen 40+ GB per day work on a setup like this. YMMV depending on hardware.
bennspectrum wrote:In other words, what is the limit NLS can tolerate? Do you have any suggestions or reference numbers?
I unfortunately don't have any best practices for configuration of how long to keep things open. I'd be happy to help with specific questions though.

Re: LogServer's memory be exhausted

Posted: Thu May 19, 2016 1:54 pm
by eloyd
Honestly, there are no best practices for how long to keep things open, since it all depends on what you need to do with your data. However, if you come to the Nagios 2016 World Conference, you can watch one of our consultants give a presentation on that very topic!

Details at https://conference.nagios.com/speakers/#Sean-Falzon

Re: LogServer's memory be exhausted

Posted: Thu May 19, 2016 4:25 pm
by hsmith
I agree with the NWC 2016 plug, it's a great time!

Re: LogServer's memory be exhausted

Posted: Mon May 23, 2016 10:04 pm
by bennspectrum
Thanks @hsmith and @eloyd!
hsmith wrote:Open, or on the server?
I want them open. I hope the data can be kept for 60, 70, or even more days, so I can query it conveniently.

Re: LogServer's memory be exhausted

Posted: Tue May 24, 2016 9:40 am
by rkennedy
There isn't a great way to estimate how much memory NLS will need, but keeping all 60 days' worth of logs open is going to be rather difficult.

At 20 GB/day of logs your cache will easily be overloaded after 1-2 weeks; for something like 60-70 days you would need a cluster and an immense amount of RAM to handle the load.

I recommend opening the indexes per day as you need them, and drilling down that way. This conserves your memory and avoids having to build a huge cluster to handle it all.
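The per-day workflow above can be sketched with the open/close index API of Elasticsearch, which Log Server runs on. This is a minimal sketch, assuming the default Logstash daily index naming (logstash-YYYY.MM.DD) and a local Elasticsearch endpoint; check your own install for the actual index pattern and address.

```python
from datetime import date, timedelta

# Assumed endpoint and naming convention -- adjust for your NLS install.
ES = "http://localhost:9200"

def index_name(day: date) -> str:
    """Daily index name in the default Logstash convention."""
    return day.strftime("logstash-%Y.%m.%d")

def open_url(day: date) -> str:
    """Endpoint to POST to when opening a closed daily index for querying."""
    return f"{ES}/{index_name(day)}/_open"

def close_url(day: date) -> str:
    """Endpoint to POST to when done, closing the index to free memory again."""
    return f"{ES}/{index_name(day)}/_close"

# e.g. drill into logs from 45 days ago, then close that index afterwards:
day = date(2016, 5, 24) - timedelta(days=45)
print(open_url(day))
print(close_url(day))
```

The point of the open/close pair is that a closed index costs disk but almost no heap, so only the day you are actively drilling into occupies memory.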

Re: LogServer's memory be exhausted

Posted: Tue May 24, 2016 9:55 am
by eloyd
@rk is right - 60 days at dozens of gigs per day is going to be a BIG set of indexes, even if it's distributed. I always encourage our clients to examine what the goal is. Do you really need 60 days' worth of search capability? At 20 GB/day for 60 days, that's over a terabyte of information you'll be searching. That's a LOT of data. Instead, ask: can I just search recent data for trends, alert on those trends, and, if I need to, open up past data to get more information?
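The back-of-the-envelope arithmetic behind that estimate, as a quick sketch (raw log volume only; replica shards and indexing overhead would push the real total higher):

```python
# Retention estimate: daily ingest times days kept open.
gb_per_day = 20   # the 20 GB/day figure from this thread
days_open = 60    # desired retention window

total_gb = gb_per_day * days_open
total_tb = total_gb / 1000

print(total_gb)   # 1200 GB
print(total_tb)   # 1.2 TB of searchable data
```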

In the end, however, the answer to your question is, "try it." It may work for you, it may not.

Re: LogServer's memory be exhausted

Posted: Tue May 24, 2016 4:10 pm
by rkennedy
Thanks @eloyd!

@bennspectrum - let us know if you have any further questions.