NLS Log from file not working

This support forum board is for support questions relating to Nagios Log Server, our solution for managing and monitoring critical log data.
kconti
Posts: 33
Joined: Thu Mar 26, 2015 11:25 am

Re: NLS Log from file not working

Post by kconti »

I have attached the sample log (the same log, but cut down a lot and with sensitive info replaced).

Also, I tried your input, saved it, and applied the configuration. I get the message: "Logstash is currently collecting locally on: 192.168.2.108 tcp: 2056, 5544, 2057, 3515 udp: 5544"
It doesn't mention port 9001, which you put in the input. Do I need to use one of the ports above? I believe all our other syslogs are using 5544.

The configuration took, but the results are the same.

I really do appreciate the help. Trying this out yourself should at least show whether this is operator error or not =)
kconti

Re: NLS Log from file not working

Post by kconti »

I'm assuming I'll need to define an "output" since we created the separate input?
jolson
Attack Rabbit
Posts: 2560
Joined: Thu Feb 12, 2015 12:40 pm

Re: NLS Log from file not working

Post by jolson »

I have tested this in my lab - with your current setup, are you seeing similar results to what I am seeing? See attachment:
Capture.PNG
My inputs/filters:

Code:

tcp {
    type => 'csvinput'
    port => 9001
}

if [type] == 'csvinput' {
    csv {
        columns => ["Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"]
        separator => ","
    }
}
It looks like the message is being parsed properly, as you can see in my attachment. Now we need to add more to the filter to parse your fields appropriately. I drafted a filter that works in my lab, which you can test.



So, we have the 'message' field looking like this:
<133>Apr 8 11:50:10 nagioscore csvtag: Master,,04/08/15 6:42,04/08/15 6:42,Access granted,Doe,Kristen,550000002BD2B127,2nd Fl S2 Node,2ND FL OPEN OFFICE AREA,2ND FL - OPEN OFFICE AREA,201

Let's throw some regex together to parse it properly (I like to use http://grokdebug.herokuapp.com/ to help me with these). Please keep in mind that I came up with this filter very quickly and it will need to be adjusted and tuned by you.

Code:

^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:something}: %{GREEDYDATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$
Let's add the defined regex to our filter...

Code:

    if [type] == 'csvinput' {
        csv {
            columns => ["Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"]
            separator => ","
        }
      grok {
        match => [ "message", "^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:something}: %{DATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$" ]
      }
    }
Which results in what we see in the second attachment:
Capture2.PNG
As you can see, the fields are (mostly) parsed out. Like I said, it will need a little bit of work, but I hope that this can serve as a good example.

Thanks!


Jesse
Twits Blog
Show me a man who lives alone and has a perpetually clean kitchen, and 8 times out of 9 I'll show you a man with detestable spiritual qualities.
kconti

Re: NLS Log from file not working

Post by kconti »

That looks good on your end! Very nice.

Using your input/filters, do I also need to do the following on the remote source?
bash setup-linux.sh -s 192.168.2.108 -p 9001 -f /var/log/S2_backup/Access_Log.csv -t Access_Log

9001 instead of 5544?
kconti

Re: NLS Log from file not working

Post by kconti »

I tried it, and the only "type" I still see is syslog. I searched for everything in the past 7 days, and all 10,000 events were syslog.

I'm clearly missing a step.
jolson

Re: NLS Log from file not working

Post by jolson »

That is correct - you will need to start shipping the .csv to port 9001. Be sure to open tcp port 9001 on your Nagios Log Server. You can open the port in iptables using the following command:

Code:

iptables -A INPUT -p tcp --dport 9001 -j ACCEPT
It's worth noting that you can use any port you want - I chose 9001 arbitrarily. You will need to save your iptables config afterward:

Code:

service iptables save
After opening the port, to re-send the .csv file, either wait for the file to change or change it yourself (I replaced a ',' with an identical ',', just to trigger a write). If you still see no data, restart rsyslog on your host:

Code:

service rsyslog restart
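If waiting for a change is awkward, appending a throwaway line is a simple way to guarantee imfile sees new data: it polls the file on an interval and tracks how far it has read in its state file, so an in-place edit that doesn't grow the file may be missed unless the editor rewrites the inode. A hedged sketch, demonstrated on a scratch copy (on the real host the path would be /var/log/S2_backup/Access_Log.csv):

```shell
# Sketch: force a file change that rsyslog's imfile module will pick up.
# Using a scratch copy here; substitute the real CSV path on the host.
CSV=/tmp/Access_Log_test.csv
printf 'Master,,04/08/15 6:42,04/08/15 6:42,Access granted,Doe,Kristen,550000002BD2B127,2nd Fl S2 Node,AREA,AREA,201\n' > "$CSV"
# Appending a new line grows the file, so imfile reads past its saved offset
# on the next poll and forwards the new line.
printf 'Master,,04/08/15 6:43,04/08/15 6:43,Access granted,Doe,John,550000002BD2B128,2nd Fl S2 Node,AREA,AREA,202\n' >> "$CSV"
wc -l < "$CSV"
```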
Let me know if this gets you started. Thanks!
kconti

Re: NLS Log from file not working

Post by kconti »

Hi, I made the following changes, but still no luck:

Stopped IPTABLES temporarily:
/etc/init.d/iptables stop

Verified the log folder and log file are accessible with the right permissions - /var/log/S2_backup/Access_log.csv

Ran:
bash setup-linux.sh -s 192.168.2.108 -p 9001 -f /var/log/S2_backup/Access_Log.csv -t Access_Log

Verified conf file "90-nagioslogserver_var_log_S2_backup_Access_Log.csv.conf" is now using port 9001
Edited /etc/rsyslog.conf and now have two lines like this:
*.* @192.168.2.108:5544
*.* @192.168.2.108:9001

Verified with tcpdump that 9001 is going through on both ends.

Checked Firewall log to see if anything is being blocked - nothing. All is allowed through.
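One detail worth double-checking in the rsyslog.conf lines above: in rsyslog's forwarding syntax, a single @ forwards over UDP while a double @@ forwards over TCP. The csvinput listener defined here is a tcp {} input, so a line forwarding to 9001 over UDP would never reach it. The 9001 line would likely need to be:

```
*.* @@192.168.2.108:9001
```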

My Global configuration Inputs:
Syslog
Windows Event Log
Import - Raw
Import - JSON
TCP:
tcp {
type => 'csvinput'
port => 9001
}

My Global configuration Filters:
Apache
CSV:
if [type] == 'csvinput' {
csv {
columns => ["Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"]
separator => ","
}
grok {
match => [ "message", "^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:something}: %{DATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$" ]
}
}



No Outputs configured.

In the Dashboard I search for everything over the past 7 days (it takes a while), then go down to the program field and click it; the Access_log program is not listed. I then go down and click "type", and syslog is the only type available.
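Rather than scanning the program and type fields by hand, the dashboard query box accepts Lucene-style field queries (Nagios Log Server dashboards are Kibana-based), so the new events can be filtered for directly. Assuming the type from the input above and the -t tag passed to setup-linux.sh:

```
type:csvinput OR program:Access_Log
```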
jolson

Re: NLS Log from file not working

Post by jolson »

Please verify that your configs were pushed to your logstash server properly:

Code:

cat /usr/local/nagioslogserver/logstash/etc/conf.d/*
Verify that your server is listening on port 9001:

Code:

netstat -na | grep 9001
Your setup looks proper to me.

If everything looks good, edit the CSV file (Access_Log.csv) and replace one character with the same character - then write the file. This write seems to tell rsyslog to re-send the file, based on my lab testing. If that doesn't work, restart rsyslog:

Code:

service rsyslog restart
Let me know what you find out. Thank you.
kconti

Re: NLS Log from file not working

Post by kconti »

Seems to be listening on that port:
# netstat -na | grep 9001
tcp 0 0 :::9001 :::* LISTEN

When altering the file didn't fix it, I restarted rsyslog.

So this is exactly how yours is set up? You didn't specify any file location other than in the generated 90-nagioslogserver....conf? Maybe that conf file is not being read?

How can I verify Logstash is getting the information? I have a feeling it just isn't making it over to Elasticsearch.

Full configuration below:
cat /usr/local/nagioslogserver/logstash/etc/conf.d/*
#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Thu, 09 Apr 2015 11:21:42 -0400
#

#
# Global inputs
#

input {
syslog {
type => 'syslog'
port => 5544
}
tcp {
type => 'eventlog'
port => 3515
codec => json {
charset => 'CP1252'
}
}
tcp {
type => 'import_raw'
tags => 'import_raw'
port => 2056
}
tcp {
type => 'import_json'
tags => 'import_json'
port => 2057
codec => json
}
tcp {
type => 'csvinput'
port => 9001
}
}

#
# Local inputs
#


#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Thu, 09 Apr 2015 11:21:42 -0400
#

#
# Global filters
#

filter {
if [program] == 'apache_access' {
grok {
match => [ 'message', '%{COMBINEDAPACHELOG}']
}
date {
match => [ 'timestamp', 'dd/MMM/yyyy:HH:mm:ss Z' ]
}
mutate {
replace => [ 'type', 'apache_access' ]
convert => [ 'bytes', 'integer' ]
convert => [ 'response', 'integer' ]
}
}

if [program] == 'apache_error' {
grok {
match => [ 'message', '\[(?<timestamp>%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\] \[%{WORD:class}\] \[%{WORD:originator} %{IP:clientip}\] %{GREEDYDATA:errmsg}']
}
mutate {
replace => [ 'type', 'apache_error' ]
}
}
if [type] == 'csvinput' {
csv {
columns => ["Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"]
separator => ","
}
grok {
match => [ "message", "^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:something}: %{DATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$" ]
}
}
}

#
# Local filters
#


#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Thu, 09 Apr 2015 11:21:42 -0400
#

#
# Required output for Nagios Log Server
#

output {
elasticsearch {
cluster => '199525a6-0502-414f-8d1f-5a3d5e7fd90e'
host => 'localhost'
index_type => '%{type}'
node_name => '0f8f3f1d-7049-4bb0-b1d0-c1ad93b958c8'
protocol => 'transport'
workers => 4
}
}

#
# Global outputs
#



#
# Local outputs
#
jolson

Re: NLS Log from file not working

Post by jolson »

Yes, that is exactly how mine is set up. In fact, our Logstash configurations are almost exactly the same - though yours is missing some indentation.

The exact commands I entered on Nagios Log Server to get this working:

Code:

vi /etc/sysconfig/iptables      # edited to enable tcp/9001
service iptables restart
netstat -na | grep 9001         # ensured the port is listening
tcpdump -n dst port 9001        # verified packets are being received
cat /usr/local/nagioslogserver/logstash/etc/conf.d/*    # verified configurations
The exact commands I entered on my test client machine:

Code:

bash setup-linux.sh -h
ll
mv Access_Log\ -\ Copy.csv csvtest.csv
ll
bash setup-linux.sh -s 192.168.4.203 -p 9001 -f /root/csvtest.csv -t csvtag
A cat of my rsyslog configuration:

Code:

[root@nagioscore ~]# cat /etc/rsyslog.d/90-nagioslogserver_root_csvtest.csv.conf
$ModLoad imfile
$InputFilePollInterval 10
$PrivDropToGroup adm
$WorkDirectory /var/lib/rsyslog

# Input for csvtag
$InputFileName /root/csvtest.csv
$InputFileTag csvtag:
$InputFileStateFile nls-state-root_csvtest.csv # Must be unique for each file being polled
# Uncomment the following line to override the default severity for messages
# from this file.
#$InputFileSeverity info
$InputFilePersistStateInterval 20000
$InputRunFileMonitor

# Forward to Nagios Log Server and then discard, otherwise these messages
# will end up in the syslog file (/var/log/messages) unless there are other
# overriding rules.
if $programname == 'csvtag' then @@NLSIP:9001
if $programname == 'csvtag' then ~
Cat of my logstash configs:

Code:

cat /usr/local/nagioslogserver/logstash/etc/conf.d/*
#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Thu, 09 Apr 2015 11:01:20 -0400
#

#
# Global inputs
#

input {
    tcp {
        type => 'import_json'
        tags => 'import_json'
        port => 2057
        codec => json
    }
    tcp {
        type => 'import_raw'
        tags => 'import_raw'
        port => 2056
    }
    tcp {
        type => 'eventlog'
        port => 3515
        codec => json {
            charset => 'CP1252'
        }
    }
    syslog {
        type => 'syslog'
        port => 5544
    }
    tcp {
        type => 'csvinput'
        port => 9001
    }
}

#
# Local inputs
#


#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Thu, 09 Apr 2015 11:01:20 -0400
#

#
# Global filters
#

filter {
    if [type] == 'csvinput' {
        csv {
            columns => ["Partition", "Person ID", "Node Date/Time", "Date/Time", "Description", "Last Name", "First Name", "Node UID", "Node Name", "Location", "Reader", "Card Number"]
            separator => ","
        }
      grok {
        match => [ "message", "^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:something}: %{DATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$" ]
      }
    }
    if [program] == 'apache_access' {
        grok {
            match => [ 'message', '%{COMBINEDAPACHELOG}']
        }
        date {
            match => [ 'timestamp', 'dd/MMM/yyyy:HH:mm:ss Z' ]
        }
        mutate {
            replace => [ 'type', 'apache_access' ]
             convert => [ 'bytes', 'integer' ]
             convert => [ 'response', 'integer' ]
        }
    }

    if [program] == 'apache_error' {
        grok {
            match => [ 'message', '\[(?<timestamp>%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\] \[%{WORD:class}\] \[%{WORD:originator} %{IP:clientip}\] %{GREEDYDATA:errmsg}']
        }
        mutate {
            replace => [ 'type', 'apache_error' ]
        }
    }
}

#
# Local filters
#


#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Thu, 09 Apr 2015 11:01:20 -0400
#

#
# Required output for Nagios Log Server
#

output {
    elasticsearch {
        cluster => '9556aad3-6ee2-4205-ab0c-48906fe3162c'
        host => 'localhost'
        index_type => '%{type}'
        node_name => '54bc3e7e-7478-4f28-bfa7-5020f6fbf0ae'
        protocol => 'transport'
        workers => 4
    }
}

#
# Global outputs
#



#
# Local outputs
#
Any chance any of the above is different on your machine?
Locked