Re: NLS Log from file not working
Posted: Thu Apr 16, 2015 9:18 am
by jolson
The grok match is going to match against the message field:
match => [ "message", "^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:something}: %{DATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$" ]
messagefield.PNG
The 'message' field will need to be appropriately matched using a combination of grok patterns. You can find all available grok patterns here, and select the best ones for your purposes:
Code: Select all
http://grokdebug.herokuapp.com/patterns#
You can also debug your match pattern at the same site:
Capture5.PNG
Once you have found an appropriate pattern for your logs, please use it instead of the example that I provided. Do you have any questions about how to define your grok match pattern?
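For context, a grok match like the one above lives inside the filter section of your Logstash configuration. A minimal sketch (the pattern is abbreviated here for readability; substitute the full pattern shown earlier in this post):

Code: Select all
filter {
  grok {
    # Match against the 'message' field; replace this abbreviated
    # pattern with the full one from the example above.
    match => [ "message", "^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:rest}" ]
  }
}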
Re: NLS Log from file not working
Posted: Thu Apr 16, 2015 10:06 am
by kconti
I don't see how yours could possibly work and not mine, using the same configuration. As you can see from the snippet you copied, the "message" is garbage; nothing from the csv file is in it at all.
That tool you sent over is pretty nice, though. Your grok pattern always comes up as "no matches".
So you are saying if you were to use that tool and used the same example output as I am:
Master,_15,04/08/15 6:51,04/08/15 6:51,Access granted,Smith,Frank,550000002BD2B127,2nd Fl S2 Node,2ND FL OPEN OFFICE AREA,2ND FL - OPEN OFFICE AREA,236
And plug in your filter:
"message", "^<%{NUMBER:number}>%{MONTH:month} %{MONTHDAY:day} %{TIME:time} %{HOST:hostname} %{GREEDYDATA:something}: %{DATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$"
You actually get something back?
Re: NLS Log from file not working
Posted: Thu Apr 16, 2015 11:32 am
by jolson
So you are saying if you were to use that tool and used the same example output as I am:
Master,_15,04/08/15 6:51,04/08/15 6:51,Access granted,Smith,Frank,550000002BD2B127,2nd Fl S2 Node,2ND FL OPEN OFFICE AREA,2ND FL - OPEN OFFICE AREA,236
And plug in your filter, you actually get something back?
The thing to remember is that the log above:
Code: Select all
Master,_15,04/08/15 6:51,04/08/15 6:51,Access granted,Smith,Frank,550000002BD2B127,2nd Fl S2 Node,2ND FL OPEN OFFICE AREA,2ND FL - OPEN OFFICE AREA,236
is ultimately different from the log received after rsyslog sends it, because rsyslog adds 'syslog' data to the front of every log line it sends.
If you match the above pattern to:
Code: Select all
^%{DATA:Partition}[_,]+%{DATE_EU:date2} %{HOUR:hour2}:%{MINUTE:minute2},%{DATE_EU:date3} %{HOUR:hour3}:%{MINUTE:minute3},%{DATA:Description},%{DATA:LastName},%{DATA:FirstName},%{NOTSPACE:NodeUID},%{DATA:NodeName},%{DATA:Location},%{DATA:Reader},%{NUMBER:CardNumber}$
It will match as expected.
The issue is that after rsyslog parses and sends the .csv logs, you will end up with something like:
Code: Select all
<133>Apr 15 09:26:08 testserver csvsyslogout: Master,_15,04/08/15 6:51,04/08/15 6:51,Access granted,Smith,Frank,550000002BD2B127,2nd Fl S2 Node,2ND FL OPEN OFFICE AREA,2ND FL - OPEN OFFICE AREA,236
Which does match the full pattern:
Capture.PNG
The problem here is that the full *message* field of your .csv file is not showing up. I think that we should simplify your setup to find out why that might be. I suggest deleting all filters that you have in place, and leaving only your tcp input. After making this modification, what shows up in the 'message' field?
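As an aside, the effect of that syslog prefix can be illustrated in plain Python. This is a hypothetical sketch, not part of the NLS pipeline; the sample line is the rsyslog-wrapped log shown above:

```python
import csv
import io
import re

# The rsyslog-wrapped line from the example above.
raw = ("<133>Apr 15 09:26:08 testserver csvsyslogout: "
       "Master,_15,04/08/15 6:51,04/08/15 6:51,Access granted,Smith,Frank,"
       "550000002BD2B127,2nd Fl S2 Node,2ND FL OPEN OFFICE AREA,"
       "2ND FL - OPEN OFFICE AREA,236")

# Strip the priority/timestamp/host/tag prefix that rsyslog prepends,
# leaving only the original CSV payload.
header = re.compile(r"^<\d+>\w{3} +\d+ [\d:]+ \S+ \S+: ")
payload = header.sub("", raw, count=1)

# The remainder parses cleanly as CSV.
fields = next(csv.reader(io.StringIO(payload)))
print(fields[0])    # "Master"
print(fields[-1])   # "236"
```

This is why the bare CSV pattern matches the file contents but not the lines Logstash actually receives: the grok pattern must consume the syslog header first.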
Re: NLS Log from file not working
Posted: Thu Apr 16, 2015 5:00 pm
by kconti
So it didn't seem to fix the problem, but it did give me a message in the logstash log:
{:timestamp=>"2015-04-16T16:24:06.770000-0400", :message=>"Trouble parsing csv", :source=>"message", :raw=>"<46>Apr 16 16:24:05 support rsyslogd: [origin software=\"rsyslogd\" swVersion=\"5.8.10\" x-pid=\"24056\" x-info=\"http://www.rsyslog.com\"] exiting on signal 15.", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>, :level=>:warn}
{:timestamp=>"2015-04-16T16:24:06.881000-0400", :message=>"Trouble parsing csv", :source=>"message", :raw=>"<46>Apr 16 16:24:05 support rsyslogd: [origin software=\"rsyslogd\" swVersion=\"5.8.10\" x-pid=\"24110\" x-info=\"http://www.rsyslog.com\"] start", :exception=>#<CSV::MalformedCSVError: Illegal quoting in line 1.>, :level=>:warn}
I then tried removing rsyslog altogether, and that didn't end well; logstash kept stopping, so I added all of the syslog configurations and inputs back. The only filter is csv.
I don't get where it is getting that garbage information.
Re: NLS Log from file not working
Posted: Thu Apr 16, 2015 5:05 pm
by jolson
The client is sending that information - the client rsyslog will add it to the beginning of each log. I know it can be a pain... I've tried to get rid of it to no avail. Please remove the csv filter as well. What do you see in your dashboard at this point? You should only have a simple input. We can define a filter from there.
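Once we get to re-adding filters: one way to quiet those "Illegal quoting" warnings is to guard the csv filter with a conditional so rsyslogd's own start/stop notices never reach it. A sketch, untested against your exact NLS version:

Code: Select all
filter {
  # rsyslogd's own start/stop notices contain embedded quotes and are
  # not CSV, so drop them before they reach the csv filter.
  if [message] =~ /rsyslogd: \[origin/ {
    drop { }
  } else {
    csv { }
  }
}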
Re: NLS Log from file not working
Posted: Fri Apr 17, 2015 11:18 am
by kconti
Configuration file:
Code: Select all
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Fri, 17 Apr 2015 12:11:45 -0400
#
#
# Global inputs
#
input {
tcp {
type => 'csvinput'
port => 9001
}
}
#
# Local inputs
#
#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Fri, 17 Apr 2015 12:11:45 -0400
#
#
# Global filters
#
#
# Local filters
#
#
# Logstash Configuration File
# Dynamically created by Nagios Log Server
#
# DO NOT EDIT THIS FILE. IT WILL BE OVERWRITTEN.
#
# Created Fri, 17 Apr 2015 12:11:45 -0400
#
#
# Required output for Nagios Log Server
#
output {
elasticsearch {
cluster => '199525a6-0502-414f-8d1f-5a3d5e7fd90e'
host => 'localhost'
index_type => '%{type}'
node_name => '0f8f3f1d-7049-4bb0-b1d0-c1ad93b958c8'
protocol => 'transport'
workers => 4
}
}
#
# Global outputs
#
#
# Local outputs
#
Output:
latest_output_no filters.jpg
Can I see your /etc/rsyslog.conf? That's the only file we really haven't compared.
Re: NLS Log from file not working
Posted: Fri Apr 17, 2015 11:51 am
by jolson
No problem, my configs are below.
/etc/rsyslog.conf:
Code: Select all
# rsyslog v5 configuration file
# For more information see /usr/share/doc/rsyslog-*/rsyslog_conf.html
# If you experience problems, see http://www.rsyslog.com/doc/troubleshoot.html
#### MODULES ####
$ModLoad imuxsock # provides support for local system logging (e.g. via logger command)
$ModLoad imklog # provides kernel logging support (previously done by rklogd)
#$ModLoad immark # provides --MARK-- message capability
# Provides UDP syslog reception
#$ModLoad imudp
#$UDPServerRun 514
# Provides TCP syslog reception
#$ModLoad imtcp
#$InputTCPServerRun 514
#### GLOBAL DIRECTIVES ####
# Use default timestamp format
$ActionFileDefaultTemplate RSYSLOG_TraditionalFileFormat
# File syncing capability is disabled by default. This feature is usually not required,
# not useful and an extreme performance hit
#$ActionFileEnableSync on
# Include all config files in /etc/rsyslog.d/
#### RULES ####
# Log all kernel messages to the console.
# Logging much else clutters up the screen.
#kern.* /dev/console
# Log anything (except mail) of level info or higher.
# Don't log private authentication messages!
*.info;mail.none;authpriv.none;cron.none /var/log/messages
# The authpriv file has restricted access.
authpriv.* /var/log/secure
# Log all the mail messages in one place.
mail.* -/var/log/maillog
# Log cron stuff
cron.* /var/log/cron
# Everybody gets emergency messages
*.emerg *
# Save news errors of level crit and higher in a special file.
uucp,news.crit /var/log/spooler
# Save boot messages also to boot.log
local7.* /var/log/boot.log
# ### begin forwarding rule ###
# The statement between the begin ... end define a SINGLE forwarding
# rule. They belong together, do NOT split them. If you create multiple
# forwarding rules, duplicate the whole block!
# Remote Logging (we use TCP for reliable delivery)
#
# An on-disk queue is created for this action. If the remote host is
# down, messages are spooled to disk and sent when it is up again.
#$WorkDirectory /var/lib/rsyslog # where to place spool files
#$ActionQueueFileName fwdRule1 # unique name prefix for spool files
#$ActionQueueMaxDiskSpace 1g # 1gb space limit (use as much as possible)
#$ActionQueueSaveOnShutdown on # save messages to disk on shutdown
#$ActionQueueType LinkedList # run asynchronously
#$ActionResumeRetryCount -1 # infinite retries if host is down
# remote host is: name/ip:port, e.g. 192.168.0.1:514, port optional
#*.* @@remote-host:514
# ### end of the forwarding rule ###
$IncludeConfig /etc/rsyslog.d/*.conf
/etc/rsyslog.d/90-nagioslogserver_root_csvtest.csv.conf:
Code: Select all
$ModLoad imfile
$InputFilePollInterval 10
$PrivDropToGroup adm
$WorkDirectory /var/lib/rsyslog
# Input for csvtest2
$InputFileName /root/csvtest.csv
$InputFileTag csvtest2:
$InputFileStateFile nls-state-root_csvtest.csv # Must be unique for each file being polled
# Uncomment the following line to override the default severity for messages
# from this file.
#$InputFileSeverity info
$InputFilePersistStateInterval 20000
$InputRunFileMonitor
# Forward to Nagios Log Server and then discard, otherwise these messages
# will end up in the syslog file (/var/log/messages) unless there are other
# overriding rules.
if $programname == 'csvtest2' then @@nls-server:8999
if $programname == 'csvtest2' then ~
This is the exact command I ran when setting up rsyslog:
Code: Select all
bash setup-linux.sh -s 192.168.4.203 -p 8999 -f /root/csvtest.csv -t csvtest2
One thing that comes to mind is that you need to enter the full file path when specifying your .csv file, so please be sure to have done so.
Does the above differ from your setup?
Re: NLS Log from file not working
Posted: Fri Apr 17, 2015 1:01 pm
by kconti
Thanks for sending that over. That is the rsyslog.conf from the NLS server, right? What about the remote server's (the sending server with the csv file) /etc/rsyslog.conf?
Re: NLS Log from file not working
Posted: Fri Apr 17, 2015 1:07 pm
by jolson
This is on a remote server that I am using to send the .csv file over. The configuration on NLS will not matter, as it doesn't use rsyslog to pick up events - that is handled by Logstash.
Re: NLS Log from file not working
Posted: Fri Apr 17, 2015 1:13 pm
by kconti
I was missing: $IncludeConfig /etc/rsyslog.d/*.conf
Heh, I knew it was just a simple misconfiguration on my end. It is now working. There are still a lot of events with junk information in them, but I am finally receiving some with real data (THANK YOU).
working_output.jpg
I'm sure I can work the filter in now and get this into a better format.
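For anyone following along, one way to work that filter in is to grok off the syslog header first and then run csv on the remainder. A sketch only; the column names are guesses based on the sample line earlier in this thread, so rename them to match your data:

Code: Select all
filter {
  grok {
    # Strip the syslog header that rsyslog prepends; keep the CSV payload.
    match => [ "message", "^<%{NUMBER:pri}>%{SYSLOGTIMESTAMP:syslog_ts} %{HOSTNAME:src_host} %{DATA:tag}: %{GREEDYDATA:csv_payload}" ]
  }
  csv {
    source => "csv_payload"
    # Column names are guesses from the sample line; adjust as needed.
    columns => [ "Partition", "Subpartition", "StartTime", "EndTime",
                 "Description", "LastName", "FirstName", "NodeUID",
                 "NodeName", "Location", "Reader", "CardNumber" ]
  }
}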
Many thanks!