NLS Log from file not working

This support forum board is for support questions relating to Nagios Log Server, our solution for managing and monitoring critical log data.
kconti
Posts: 33
Joined: Thu Mar 26, 2015 11:25 am

NLS Log from file not working

Post by kconti »

Hi all,

I'm fairly new, and hope someone can point me in the right direction here. We have a system that we don't have direct access to, but we do have access to daily application logs for that server. We are hoping to have NLS look at that file so we can report and alert on major events. However, it does not seem to read the file. I have double-checked our firewall and nothing is blocking access between the two systems. In fact, we are already getting syslog from that same server.

According to the +Log Source tab, this setup for a file is very simple:

Code: Select all

bash setup-linux.sh -s 192.168.2.108 -p 5544 -f /PATH/TO/FILE/LOG_NAME.csv -t LOG_TAG

(Yes, this is the IP and port we are using.)

From this example you can see we are trying to have it send info from a CSV file. I have also tried ".log".

I go to the NLS dashboard for the source that is already working (with syslog) and edit it so it isn't restricted to "syslog". So it is now showing all events, yet the application logs are not showing up.

Do I need to add an additional input or filter in Administration -> Global Configuration? Right now I am using the defaults. Inputs: Syslog, Windows Event Log, Import Files (RAW and JSON). Filters: Apache.
If I need an additional input or filter, could someone point me to the right information to help me out with this?

Thanks,
KC
jolson
Attack Rabbit
Posts: 2560
Joined: Thu Feb 12, 2015 12:40 pm

Re: NLS Log from file not working

Post by jolson »

Are you sure that the csv data is reaching Nagios Log Server? You can check with tcpdump.

Code: Select all

tcpdump -n dst port 5544
You will likely need to use the 'csv' filter: http://logstash.net/docs/1.4.2/filters/csv
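If the tcpdump output is too noisy to be conclusive, you can also sanity-check the TCP transport itself by pushing a single line to a listener and confirming it arrives intact. A self-contained Python sketch of that round trip (loopback and an ephemeral port here just for the demo; against the real server you would target 192.168.2.108:5544, and the sample csv line is made up):

```python
import socket
import threading

received = []

# Listener standing in for the Nagios Log Server input port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))          # port 0 = let the OS pick a free port
server.listen(1)
port = server.getsockname()[1]

def accept_one():
    conn, _ = server.accept()
    with conn:
        received.append(conn.recv(4096).decode())

t = threading.Thread(target=accept_one)
t.start()

# Sender standing in for the shipper: one csv line over TCP.
line = "Default,1234,2015-04-08 11:19:40,2015-04-08 11:19:41,Granted\n"
with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(line.encode())

t.join()
server.close()
print(received[0].strip())
```

If the same kind of one-off send toward the real port never shows up in tcpdump, the problem is upstream of Logstash entirely.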

Let me know if this helps. Thanks!
Twits Blog
Show me a man who lives alone and has a perpetually clean kitchen, and 8 times out of 9 I'll show you a man with detestable spiritual qualities.
kconti
Posts: 33
Joined: Thu Mar 26, 2015 11:25 am

Re: NLS Log from file not working

Post by kconti »

Hi jolson,

Thanks for the reply. I am getting other traffic on the tcpdump due to the other syslog info coming from that server, so it's hard to tell. I'm not exactly sure how to force the poll to go through; however, I have edited the conf file that gets generated and changed "InputFilePollInterval 10" to "InputFilePollInterval 1", assuming this changes the poll from 10 minutes to 1 minute? I may be wrong here. In any case, I don't see anything being polled consistently; it all seems sort of random.

I have attempted to add a quick filter, something like this:

Code: Select all

filter {
    csv {
        add_field => Access
        add_tag => Access_Log
        columns => "Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"
    }
}
However, when I verify the configuration, I get an error saying it is looking for a "#" after csv.

The documentation for this tool really seems to be lacking detail and good examples. I'd just like to read in data from a CSV with the columns above. I haven't found any examples out there, nor have I had any training in this, so I may be in over my head here.
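For reference, here is roughly what I'm expecting the csv filter to do with one line from our file. This is just a plain Python sketch of the idea (split on the separator and pair values with the column names; the sample line is invented, and unlike the real filter this ignores quoted fields):

```python
# The column names we want the csv filter to assign, in file order.
columns = ["Partition", "Person ID", "Node Date/Time", "Date/Time",
           "Description", "Last Name", "First Name", "Node UID",
           "Node Name", "Location", "Reader", "Card Number"]

# One made-up line in the shape of our access log.
line = ("Default,1042,04/08/2015 11:19,04/08/2015 11:19,Access Granted,"
        "Smith,Jane,17,Lobby-1,HQ,Front Door,55001")

# Pair each value with its column name, as the csv filter would.
fields = dict(zip(columns, line.split(",")))
print(fields["Partition"], fields["Last Name"], fields["Card Number"])
```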
jolson
Attack Rabbit
Posts: 2560
Joined: Thu Feb 12, 2015 12:40 pm

Re: NLS Log from file not working

Post by jolson »

I think that the following might help you out: https://kevinkirsche.com/2014/08/25/usi ... ticsearch/

Since your input is over tcp/udp, we won't have to worry about the way he's inputting files. One thing that he defined in his filter is the 'separator':

Code: Select all

filter {  
    csv {
        columns => ["@timestamp", "interface", "bytes in", "bytes out"]
        separator => ","
    }
}
Let us know if this works for you. Thanks!
kconti
Posts: 33
Joined: Thu Mar 26, 2015 11:25 am

Re: NLS Log from file not working

Post by kconti »

jolson - thank you for your support with this.

Do you recommend applying these filters via the command line in a particular conf file? Doing it through the GUI doesn't seem to be working out well.

When adding a second filter (below Apache filter), I get the following error:

"Error: Expected one of #, => at line 82, column 13 (byte 1635) after filter {
if [program] == 'apache_access' {
grok {
match => [ 'message', '%{COMBINEDAPACHELOG}']
}
date {
match => [ 'timestamp', 'dd/MMM/yyyy:HH:mm:ss Z' ]
}
mutate {
replace => [ 'type', 'apache_access' ]
convert => [ 'bytes', 'integer' ]
convert => [ 'response', 'integer' ]
}
}

if [program] == 'apache_error' {
grok {
match => [ 'message', '\[(?<timestamp>%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\] \[%{WORD:class}\] \[%{WORD:originator} %{IP:clientip}\] %{GREEDYDATA:errmsg}']
}
mutate {
replace => [ 'type', 'apache_error' ]
}
}
filter {
csv"

I believe the error is stating that I'm missing a "," at line 82, yet between both filters there are far fewer lines than that. It looks like it is trying to combine part of the CSV filter with the Apache filter even though they are separate. "filter { csv" does not exist in the Apache filter.
jolson
Attack Rabbit
Posts: 2560
Joined: Thu Feb 12, 2015 12:40 pm

Re: NLS Log from file not working

Post by jolson »

The webGUI is where we will need to define all configurations. It looks like the problem we're running into is that through the webGUI, the outer 'filter' block is already implied - so adding

Code: Select all

filter { 
    csv {
        columns => ["@timestamp", "interface", "bytes in", "bytes out"]
        separator => ","
    }
}
Will throw an error. Instead, we need to add:

Code: Select all

csv {
    columns => ["@timestamp", "interface", "bytes in", "bytes out"]
    separator => ","
}
You can check your results after pressing 'Apply Configuration' by running the following on your CLI:

Code: Select all

cat /usr/local/nagioslogserver/logstash/etc/conf.d/*
Please return that output to me. Thank you!
kconti
Posts: 33
Joined: Thu Mar 26, 2015 11:25 am

Re: NLS Log from file not working

Post by kconti »

It accepted that configuration - thanks for spotting that. I'm still unable to search the contents after the configurations have been applied.

Here is the output you requested:
cat /usr/local/nagioslogserver/logstash/etc/conf.d/*
#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Wed, 08 Apr 2015 11:19:40 -0400
#

#
# Global inputs
#

input {
    syslog {
        type => 'syslog'
        port => 5544
    }
    tcp {
        type => 'eventlog'
        port => 3515
        codec => json {
            charset => 'CP1252'
        }
    }
    tcp {
        type => 'import_raw'
        tags => 'import_raw'
        port => 2056
    }
    tcp {
        type => 'import_json'
        tags => 'import_json'
        port => 2057
        codec => json
    }
}

#
# Local inputs
#


#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Wed, 08 Apr 2015 11:19:40 -0400
#

#
# Global filters
#

filter {
    if [program] == 'apache_access' {
        grok {
            match => [ 'message', '%{COMBINEDAPACHELOG}']
        }
        date {
            match => [ 'timestamp', 'dd/MMM/yyyy:HH:mm:ss Z' ]
        }
        mutate {
            replace => [ 'type', 'apache_access' ]
            convert => [ 'bytes', 'integer' ]
            convert => [ 'response', 'integer' ]
        }
    }

    if [program] == 'apache_error' {
        grok {
            match => [ 'message', '\[(?<timestamp>%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\] \[%{WORD:class}\] \[%{WORD:originator} %{IP:clientip}\] %{GREEDYDATA:errmsg}']
        }
        mutate {
            replace => [ 'type', 'apache_error' ]
        }
    }
    csv {
        columns => ["Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"]
        separator => ","
    }
}

#
# Local filters
#


#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Wed, 08 Apr 2015 11:19:40 -0400
#

#
# Required output for Nagios Log Server
#

output {
    elasticsearch {
        cluster => '199525a6-0502-414f-8d1f-5a3d5e7fd90e'
        host => 'localhost'
        index_type => '%{type}'
        node_name => '0f8f3f1d-7049-4bb0-b1d0-c1ad93b958c8'
        protocol => 'transport'
        workers => 4
    }
}

#
# Global outputs
#



#
# Local outputs
#
kconti
Posts: 33
Joined: Thu Mar 26, 2015 11:25 am

Re: NLS Log from file not working

Post by kconti »

The columns are now coming up as "fields" since I applied that configuration, which is great; that part of the configuration is working. I just need to figure out why the data isn't quite making it to the search.
kconti
Posts: 33
Joined: Thu Mar 26, 2015 11:25 am

Re: NLS Log from file not working

Post by kconti »

Correction: only some of the fields are showing up. If I do a search for results with just those fields, it looks like some events are getting parsed into the new fields. For example, one of my columns is "Partition". An event search for all results with Partition gives me the information in the "message" field.

I created the tag "Access_Log" so that, if this was working, it would show up with that name in the "program" field. It isn't listed. I'm only getting the following:
sudo 7720
crond 1166
cron 558
postfix/smtp 290
postfix/qmgr 152
kernel 17
ssh 15
script 11
ntpd 10
rsyslogd 7
jolson
Attack Rabbit
Posts: 2560
Joined: Thu Feb 12, 2015 12:40 pm

Re: NLS Log from file not working

Post by jolson »

Can you please post a small example .csv file? I would like to test this in my lab.

Another thing to note: Your .csv file is currently being sent to the 'syslog' input. The nature of this input is that it will parse your .csv file using the built-in 'syslog' filter, which could mess with your data. The best route to take would be to define a separate input for your .csv files - an input that does not apply filters. For example, if your logs are being sent via tcp:

Code: Select all

tcp {
    type => 'csvinput'
    port => 9001
}

if [type] == 'csvinput' {
    csv {
        columns => ["Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"]
        separator => ","
    }
}
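The routing above, sketched in Python purely for illustration (the type name and port mirror the config; the event dicts are invented): events arriving on the 'csvinput' port get csv-parsed into named fields, while everything else passes through untouched.

```python
# Column names from the csv filter above, in file order.
columns = ["Partition", "Person ID", "Node Date/Time", "Date/Time",
           "Description", "Last Name", "First Name", "Node UID",
           "Node Name", "Location", "Reader", "Card Number"]

def apply_filters(event):
    # Mirrors: if [type] == 'csvinput' { csv { ... } }
    if event.get("type") == "csvinput":
        event.update(zip(columns, event["message"].split(",")))
    return event

# A syslog event is left alone; a csvinput event gains named fields.
syslog_event = {"type": "syslog", "message": "sshd: session opened"}
csv_event = {"type": "csvinput",
             "message": "Default,1042,t1,t2,Granted,Smith,Jane,17,N1,HQ,Door,55001"}

print(apply_filters(syslog_event)["message"])   # unchanged
print(apply_filters(csv_event)["Description"])  # parsed out of the csv line
```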
Locked