Hi,
I've got a log file with the following content:
2016-11-23 03:00:14.651 - 00000001;{00000000-0000-0000-0000-000000000000};Print Manager Started
2016-11-23 03:00:14.714 - 00000000;{00000000-0000-0000-0000-000000000000};Synchronization: APS information restored
When I try to parse these lines with grok, I always get a grok parse error. I've come up with the following filter to try to debug it:
if [type] == "srvprintrp-momaps" {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601} \- %{BASE10NUM}\;\{00000000\-0000\-0000\-0000\-000000000000\}\;%{GREEDYDATA:info1}" ]
        add_tag => "grokked_srvprintrp1"
    }
    grok {
        match => [ "message", "%{YEAR:Year}-%{GREEDYDATA:info2}" ]
        add_tag => "grokked_srvprintrp2"
    }
}
However, the first grok always fails.
When I use: match => [ "message", "%{GREEDYDATA:Data}" ]
everything 'works'.
When I add parsing of the year: match => [ "message", "%{YEAR:Year}-%{GREEDYDATA:Data}" ]
it fails again.
I have no idea how to debug this simple log further.
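One way to debug this outside Logstash is to translate the grok expression into a plain regular expression (grok patterns ultimately compile down to regexes) and test it against a sample line. Below is a rough sketch in Python, using hand-written stand-ins for %{TIMESTAMP_ISO8601} and %{BASE10NUM} (simplified approximations, not the exact definitions Logstash ships with). If the plain regex matches the sample text, the problem is more likely in the raw input (e.g. its encoding) than in the pattern itself:

```python
import re

# Hand-written stand-ins for the grok patterns used above. These are
# simplified approximations, NOT the exact definitions Logstash ships with.
TIMESTAMP_ISO8601 = r"\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+"
BASE10NUM = r"\d+"

# Same escaping as in the grok expression: \- \; \{ \} are all legal
# regex escapes for literal characters, so the pattern itself is valid.
pattern = (TIMESTAMP_ISO8601 + r" \- " + BASE10NUM +
           r"\;\{00000000\-0000\-0000\-0000\-000000000000\}\;(?P<info1>.*)")

line = ("2016-11-23 03:00:14.651 - 00000001;"
        "{00000000-0000-0000-0000-000000000000};Print Manager Started")
m = re.search(pattern, line)
print(m.group("info1") if m else "no match")  # -> Print Manager Started
```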
parse log file with grok
Re: parse log file with grok
It's not entirely clear what you're trying to do with this grok filter. There are some syntax errors and consistency issues throughout.
If you can tell me how you would like to break this message down, I would be happy to assist with writing a grok filter for the use case.
Former Nagios employee
https://www.mcapra.com/
Re: parse log file with grok
He wants the logs parsed into a few fields. The logs come in as type "srvprintrp-momaps".
Maybe something like this?
Code: Select all
if [type] == "srvprintrp-momaps" {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:logtimestamp} \- %{BASE10NUM:logsequence}\;\{00000000\-0000\-0000\-0000\-000000000000\:logid}\;%{GREEDYDATA:logmessage}" ]
        add_tag => "grokked_srvprintrp1"
    }
}
The weird thing is that the logs:
Code: Select all
2016-11-23 03:00:14.651 - 00000001;{00000000-0000-0000-0000-000000000000};Print Manager Started
2016-11-23 03:00:14.714 - 00000000;{00000000-0000-0000-0000-000000000000};Synchronization: APS information restored
match in tools like http://grokconstructor.appspot.com/do/match, but we keep getting grok failures.
Grtz
Willem
Nagios XI 5.8.1
https://outsideit.net
Re: parse log file with grok
It's probably an issue with escaping.
I had luck with the following filter:
Code: Select all
if [type] == "srvprintrp-momaps" {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:logtimestamp} - %{BASE10NUM:logsequence};\{(?<logid>[0-9]{8}-[0-9]{4}-[0-9]{4}-[0-9]{4}-[0-9]{12})\};%{GREEDYDATA:logmessage}" ]
        add_tag => "grokked_srvprintrp1"
    }
}
Which, using the following source message:
Code: Select all
2016-11-23 03:00:14.651 - 00000001;{00000000-0000-0000-0000-000000000000};My test messsage: some other stuff [but also useful]
Produced the following event:
You do not have the required permissions to view the files attached to this post.
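For what it's worth, grok's (?<name>...) syntax is ordinary named-capture, so the logid part of a filter like the one above can be checked as a plain regex. A small sketch in Python, where the equivalent spelling is (?P<name>...):

```python
import re

# The GUID capture from the filter above, using Python's (?P<name>...)
# spelling of the (?<name>...) named group that grok accepts.
logid_re = re.compile(
    r"\{(?P<logid>[0-9]{8}-[0-9]{4}-[0-9]{4}-[0-9]{4}-[0-9]{12})\}")

line = ("2016-11-23 03:00:14.651 - 00000001;"
        "{00000000-0000-0000-0000-000000000000};Print Manager Started")
m = logid_re.search(line)
print(m.group("logid"))  # -> 00000000-0000-0000-0000-000000000000
```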
Former Nagios employee
https://www.mcapra.com/
DigNetwerk
- Posts: 40
- Joined: Fri Oct 25, 2013 7:29 am
Re: parse log file with grok
I'm now using the following grok filter, but it still shows up as 'failed'. I still can't understand why it would also fail on the second grok parse...
Code: Select all
if [type] == "srvprintrp-momaps" {
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:logtimestamp} - %{BASE10NUM:logsequence};\{(?<logid>[0-9]{8}-[0-9]{4}-[0-9]{4}-[0-9]{4}-[0-9]{12})\};%{GREEDYDATA:logmessage}" ]
        add_tag => "grokked_srvprintrp1"
    }
    grok {
        match => [ "message", "%{TIMESTAMP_ISO8601:logtimestamp} - %{GREEDYDATA:Data}" ]
        add_tag => "grokked_srvprintrp2"
    }
}
Some more info:
The input filter in naglog:
Code: Select all
tcp {
    type => 'srvprintrp-momaps'
    port => 5612
    codec => json {
        charset => 'CP1252'
    }
}
The nxlog config file (note: the logs are in Unicode):
Code: Select all
<Input file2>
    Module im_file
    File "C:\Program Files (x86)\uniFLOW Remote Print Server\Data\MomAps_*.Log"
    ReadFromLast True
    SavePos True
    Exec $message = $raw_event; to_json();
</Input>
<Output out2>
    Module om_tcp
    Host 10.54.25.140
    Port 5612
    #Exec $hostname = hostname(); $raw_event = $Hostname + " " + $raw_event;
    Exec $raw_event = to_json();
</Output>
<Route 2>
    Path file2 => out2
</Route>
EDIT:
I have changed the nxlog config to:
Code: Select all
<Input file2>
    Module im_file
    Exec convert_fields("UTF-16LE","UTF-8"); if $raw_event == "" drop();
    File "C:\Program Files (x86)\uniFLOW Remote Print Server\Data\MomAps_*.Log"
    ReadFromLast True
    SavePos True
    Exec $message = $raw_event; to_json();
</Input>
and it now passes the grok filter, but the fields SourceModuleName & SourceModuleType now contain some strange characters.
You do not have the required permissions to view the files attached to this post.
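A note on why the encoding matters here (an illustration, assuming the files really are UTF-16LE as the convert_fields call suggests): UTF-16LE stores every ASCII character followed by a NUL byte, so if the raw bytes reach the regex engine as single-byte text, the digits of the timestamp are no longer consecutive and a pattern like %{YEAR} (four digits in a row) can never match. A minimal Python sketch:

```python
import re

# UTF-16LE stores "2016" as b'2\x000\x001\x006\x00'. Viewed byte-for-byte
# as single-byte text, the digit run is interrupted by NULs.
raw = "2016-11-23".encode("utf-16-le")
as_text = raw.decode("latin-1")  # byte-for-byte view of the raw stream

print(re.search(r"\d{4}", "2016-11-23"))  # matches
print(re.search(r"\d{4}", as_text))       # -> None: NULs break the digit run
```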
Re: parse log file with grok
The strange characters are probably a combination of your input rule:
Code: Select all
tcp {
    type => 'srvprintrp-momaps'
    port => 5612
    codec => json {
        charset => 'CP1252'
    }
}
And your nxlog object definition:
Code: Select all
<Input file2>
    Module im_file
    Exec convert_fields("UTF-16LE","UTF-8"); if $raw_event == "" drop();
    File "C:\Program Files (x86)\uniFLOW Remote Print Server\Data\MomAps_*.Log"
    ReadFromLast True
    SavePos True
    Exec $message = $raw_event; to_json();
</Input>
The input rule for Nagios Log Server expects the message to be encoded in CP1252 (charset => 'CP1252'), but you are converting the fields to UTF-8 in your nxlog definition (convert_fields("UTF-16LE","UTF-8");). This likely confuses Logstash, since it expects CP1252 but receives UTF-8.
Former Nagios employee
https://www.mcapra.com/
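To illustrate the kind of mangling such a codec mismatch produces (a hypothetical example, since the actual strange characters weren't shown in the thread): when bytes written as UTF-8 are decoded as CP1252, each byte of a multi-byte sequence becomes its own character. The UTF-8 byte order mark, for instance, turns into the classic 'ï»¿' prefix:

```python
# UTF-8 output read back with a CP1252 codec: every byte of a multi-byte
# sequence is decoded as a separate character, producing "strange characters".
utf8_bom = b"\xef\xbb\xbf"        # UTF-8 byte order mark
print(utf8_bom.decode("cp1252"))  # -> ï»¿

text = "Müller".encode("utf-8")   # b'M\xc3\xbcller'
print(text.decode("cp1252"))      # -> MÃ¼ller
```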