help creating a filter

This support forum board is for support questions relating to Nagios Log Server, our solution for managing and monitoring critical log data.
benhank
Posts: 1264
Joined: Tue Apr 12, 2011 12:29 pm

Re: help creating a filter

Post by benhank »

I gave it a shot and got this error when trying to verify the configuration:

Code: Select all

{:timestamp=>"2020-06-15T12:18:53.660000-0400", :message=>"The given configuration is invalid. Reason: Expected one of #, ( at line 198, column 13 (byte 6105) after filter {\n    if [type] == 'asa' {\n      grok{\n        match => [ 'message', '<%{POSINT:syslog_pri}>%{MONTH} +%{MONTHDAY} %{YEAR} %{TIME}: %%{WORD:LogType}-%{INT:LogSeverity}-%{INT:LogMessageNumber}: Group = (%{USERNAME:Group}|%{IP:Group}), Username = %{IPORHOST:username}, IP = %{IP:IPAddress}, Session disconnected. Session Type: %{NOTSPACE:SessionType}, Duration: (%NUMBER:DurationDays}d ?)?%{NUMBER:DurationHours}h:%{NUMBER:DurationMinutes}m:%{NUMBER:DurationSeconds}s, Bytes xmt: %{INT:BytesTransmitted:int}, Bytes rcv: %{INT:BytesReceived:int}, Reason: %{GREEDYDATA:Reason}' ]\n    }\n    geoip {\n    # database => \"/usr/share/GeoIP/GeoLiteCity.dat\"\n    source => \"IPAddress\"\n    }\n    }\n    if [program] == 'apache_access' {\n        grok {\n            match => [ 'message', '%{COMBINEDAPACHELOG}']\n        }\n        date {\n            match => [ 'timestamp', 'dd/MMM/yyyy:HH:mm:ss Z' ]\n        }\n        mutate {\n            replace => [ 'type', 'apache_access' ]\n             convert => [ 'bytes', 'integer' ]\n             convert => [ 'response', 'integer' ]\n        }\n    }\n     \n    if [program] == 'apache_error' {\n        grok {\n            match => [ 'message', '\\[(?<timestamp>%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\\] \\[%{WORD:class}\\] \\[%{WORD:originator} %{IP:clientip}\\] %{GREEDYDATA:errmsg}']\n        }\n        mutate {\n            replace => [ 'type', 'apache_error' ]\n        }\n    }\n        if [program] == \"mysqld_log\" {\n            grok {\n                match => [ \"message\", \"^%{NUMBER:date} *%{NOTSPACE:time}\"]\n            }\n            mutate {\n                replace => [ \"type\", \"mysqld_log\" ]\n            }\n        }\n    if [host] == '172.30.100.226' {\n    \n    grok {\n    \n    match => { \"message\" => 
\"result=\\\"%{WORD:result}\\\" ip=\\\"%{IP:IP}\\\" action=\\\"%{WORD:action}\\\" params=\\\"Username: %{USER:params}\\\" user=\\\"%{USER:user}\\\" tenant=\\\"%{WORD:tenant}\\\"\"}\n    match => { \"message\" => \"result=\\\"%{WORD:result}\\\" ip=\\\"%{IP:IP}\\\" action=\\\"%{WORD:action}\\\" params=\\\"Username=%{USER:params}\\\" user=\\\"%{USER:user}\\\" tenant=\\\"%{WORD:tenant}\\\"\"}\n    }\n    mutate {\n        replace => { \"Ipaddress\" => \"%{IP}\" }\n      }\n    geoip {\n    database => \"/usr/share/GeoIP/GeoLite2-City.mmdb\"\n    source => \"Ipaddress\"\n    }\n    }\n    \n    \n    \n    if [program] == 'xi_auditlog' {\n        grok {\n            match => [ 'message', '%{XIAUDITLOG_MESSAGE}' ]\n            patterns_dir => '/usr/local/nagioslogserver/etc/patterns'\n            overwrite => [ 'message' ]\n        }\n        date {\n            match => [ 'timestamp', 'yyyy-MM-dd HH:mm:ss' ]\n        }\n        mutate {\n            replace => [ 'type', 'xi_auditlog' ]\n        }\n    }\n    if [type] == 'netscaler'{\n    grok {\n                                    break_on_match => true\n                                    match => [\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message} : %{DATA} %{IP:source_ip}:%{POSINT:source_port} - %{DATA} %{IP:vserver_ip}:%{POSINT:vserver_port} - %{DATA} %{IP:nat_ip}:%{POSINT:nat_port} - %{DATA} %{IP:destination_ip}:%{POSINT:destination_port} - %{DATA} %{DATE_US:DELINK_DATE}:%{TIME:DELINK_TIME} GMT - %{DATA} %{POSINT:total_bytes_sent} - %{DATA} %{POSINT:total_bytes_recv}\",\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message} : %{DATA} %{IP:source_ip}:%{POSINT:source_port} - %{DATA} %{IP:destination_ip}:%{POSINT:destination_port} - %{DATA} %{DATE_US:START_DATE}:%{TIME:START_TIME} GMT - 
%{DATA} %{DATE_US:END_DATE}:%{TIME:END_TIME} GMT - %{DATA} %{POSINT:total_bytes_sent} - %{DATA} %{POSINT:total_bytes_recv}\",\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message} : %{DATA} %{INT:netscaler_spcbid} - %{DATA} %{IP:clientip} - %{DATA} %{INT:netscaler_client_port} - %{DATA} %{IP:netscaler_vserver_ip} - %{DATA} %{INT:netscaler_vserver_port} %{GREEDYDATA:netscaler_message} - %{DATA} %{WORD:netscaler_session_type}\",\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message}\"\n                                    ]\n                            }\n    }\n    if host ", :level=>:fatal}
Proudly running:
NagiosXI 5.4.12 2 node Prod Env 2500 hosts, 13,000 services
Nagiosxi 5.5.7(test env) 2500 hosts, 13,000 services
Nagios Logserver 2 node Prod Env 500 objects sending
Nagios Network Analyser
Nagios Fusion
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: help creating a filter

Post by cdienger »

The KV filter appears to need a bit more massaging that I haven't quite figured out yet, but this grok seems to parse things out:

Code: Select all

grok {
match => [ "message", "%{NUMBER:number},%{YEAR:year}/%{MONTHNUM:month}/%{MONTHDAY:day} %{TIME:time},%{NUMBER:number},%{WORD:word},%{WORD:word},%{NUMBER:number},%{YEAR:year}/%{MONTHNUM:month}/%{MONTHDAY:day} %{TIME:time},,%{DATA:data},%{DATA:data},%{NUMBER:number},%{NUMBER:number},%{WORD:word},%{WORD:word},%{QUOTEDSTRING:quotedstring},%{NUMBER:number},%{DATA:data},%{NUMBER:number},%{NUMBER:number},%{NUMBER:number},%{NUMBER:number},,%{GREEDYDATA:therestofit}" ]
}
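For what it's worth, a kv-based attempt might look roughly like the following. This is only a sketch: the delimiters are guessed from the sample data, and option names differ between kv plugin versions, so treat every value here as an assumption.

Code: Select all

kv {
    source      => "message"   # parse the raw log line
    field_split => ","         # pairs appear comma-separated (assumed)
    value_split => ":"         # "User name: lskywalker"-style pairs (assumed)
    trim        => " "         # strip padding; newer kv versions use trim_key/trim_value instead
}
The quoted description field mixes commas inside values (e.g. the OS version string), which is likely the extra massaging the kv filter still needs.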
As of May 25th, 2018, all communications with Nagios Enterprises and its employees are covered under our new Privacy Policy.
benhank
Posts: 1264
Joined: Tue Apr 12, 2011 12:29 pm

Re: help creating a filter

Post by benhank »

Here is the error I get. I've attached the full filter file (without the changes):

Code: Select all

{:timestamp=>"2020-06-16T20:09:27.766000-0400", :message=>"The given configuration is invalid. Reason: Expected one of #, ( at line 198, column 13 (byte 6105) after filter {\n    if [type] == 'asa' {\n      grok{\n        match => [ 'message', '<%{POSINT:syslog_pri}>%{MONTH} +%{MONTHDAY} %{YEAR} %{TIME}: %%{WORD:LogType}-%{INT:LogSeverity}-%{INT:LogMessageNumber}: Group = (%{USERNAME:Group}|%{IP:Group}), Username = %{IPORHOST:username}, IP = %{IP:IPAddress}, Session disconnected. Session Type: %{NOTSPACE:SessionType}, Duration: (%NUMBER:DurationDays}d ?)?%{NUMBER:DurationHours}h:%{NUMBER:DurationMinutes}m:%{NUMBER:DurationSeconds}s, Bytes xmt: %{INT:BytesTransmitted:int}, Bytes rcv: %{INT:BytesReceived:int}, Reason: %{GREEDYDATA:Reason}' ]\n    }\n    geoip {\n    # database => \"/usr/share/GeoIP/GeoLiteCity.dat\"\n    source => \"IPAddress\"\n    }\n    }\n    if [program] == 'apache_access' {\n        grok {\n            match => [ 'message', '%{COMBINEDAPACHELOG}']\n        }\n        date {\n            match => [ 'timestamp', 'dd/MMM/yyyy:HH:mm:ss Z' ]\n        }\n        mutate {\n            replace => [ 'type', 'apache_access' ]\n             convert => [ 'bytes', 'integer' ]\n             convert => [ 'response', 'integer' ]\n        }\n    }\n     \n    if [program] == 'apache_error' {\n        grok {\n            match => [ 'message', '\\[(?<timestamp>%{DAY:day} %{MONTH:month} %{MONTHDAY} %{TIME} %{YEAR})\\] \\[%{WORD:class}\\] \\[%{WORD:originator} %{IP:clientip}\\] %{GREEDYDATA:errmsg}']\n        }\n        mutate {\n            replace => [ 'type', 'apache_error' ]\n        }\n    }\n        if [program] == \"mysqld_log\" {\n            grok {\n                match => [ \"message\", \"^%{NUMBER:date} *%{NOTSPACE:time}\"]\n            }\n            mutate {\n                replace => [ \"type\", \"mysqld_log\" ]\n            }\n        }\n    if [host] == '172.30.100.226' {\n    \n    grok {\n    \n    match => { \"message\" => 
\"result=\\\"%{WORD:result}\\\" ip=\\\"%{IP:IP}\\\" action=\\\"%{WORD:action}\\\" params=\\\"Username: %{USER:params}\\\" user=\\\"%{USER:user}\\\" tenant=\\\"%{WORD:tenant}\\\"\"}\n    match => { \"message\" => \"result=\\\"%{WORD:result}\\\" ip=\\\"%{IP:IP}\\\" action=\\\"%{WORD:action}\\\" params=\\\"Username=%{USER:params}\\\" user=\\\"%{USER:user}\\\" tenant=\\\"%{WORD:tenant}\\\"\"}\n    }\n    mutate {\n        replace => { \"Ipaddress\" => \"%{IP}\" }\n      }\n    geoip {\n    database => \"/usr/share/GeoIP/GeoLite2-City.mmdb\"\n    source => \"Ipaddress\"\n    }\n    }\n    \n    \n    \n    if [program] == 'xi_auditlog' {\n        grok {\n            match => [ 'message', '%{XIAUDITLOG_MESSAGE}' ]\n            patterns_dir => '/usr/local/nagioslogserver/etc/patterns'\n            overwrite => [ 'message' ]\n        }\n        date {\n            match => [ 'timestamp', 'yyyy-MM-dd HH:mm:ss' ]\n        }\n        mutate {\n            replace => [ 'type', 'xi_auditlog' ]\n        }\n    }\n    if [type] == 'netscaler'{\n    grok {\n                                    break_on_match => true\n                                    match => [\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message} : %{DATA} %{IP:source_ip}:%{POSINT:source_port} - %{DATA} %{IP:vserver_ip}:%{POSINT:vserver_port} - %{DATA} %{IP:nat_ip}:%{POSINT:nat_port} - %{DATA} %{IP:destination_ip}:%{POSINT:destination_port} - %{DATA} %{DATE_US:DELINK_DATE}:%{TIME:DELINK_TIME} GMT - %{DATA} %{POSINT:total_bytes_sent} - %{DATA} %{POSINT:total_bytes_recv}\",\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message} : %{DATA} %{IP:source_ip}:%{POSINT:source_port} - %{DATA} %{IP:destination_ip}:%{POSINT:destination_port} - %{DATA} %{DATE_US:START_DATE}:%{TIME:START_TIME} GMT - 
%{DATA} %{DATE_US:END_DATE}:%{TIME:END_TIME} GMT - %{DATA} %{POSINT:total_bytes_sent} - %{DATA} %{POSINT:total_bytes_recv}\",\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message} : %{DATA} %{INT:netscaler_spcbid} - %{DATA} %{IP:clientip} - %{DATA} %{INT:netscaler_client_port} - %{DATA} %{IP:netscaler_vserver_ip} - %{DATA} %{INT:netscaler_vserver_port} %{GREEDYDATA:netscaler_message} - %{DATA} %{WORD:netscaler_session_type}\",\n                                            \"message\", \"<%{POSINT:syslog_pri}> %{DATE_US}:%{TIME} GMT %{SYSLOGHOST:syslog_hostname} %{GREEDYDATA:netscaler_message}\"\n                                    ]\n        
You do not have the required permissions to view the files attached to this post.
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: help creating a filter

Post by cdienger »

Missing a { in:

Code: Select all

match => [ 'message', '<%{POSINT:syslog_pri}>%{MONTH} +%{MONTHDAY} %{YEAR} %{TIME}: %%{WORD:LogType}-%{INT:LogSeverity}-%{INT:LogMessageNumber}: Group = (%{USERNAME:Group}|%{IP:Group}), Username = %{IPORHOST:username}, IP = %{IP:IPAddress}, Session disconnected. Session Type: %{NOTSPACE:SessionType}, Duration: (%NUMBER:DurationDays}d ?)?%{NUMBER:DurationHours}h:%{NUMBER:DurationMinutes}m:%{NUMBER:DurationSeconds}s, Bytes xmt: %{INT:BytesTransmitted:int}, Bytes rcv: %{INT:BytesReceived:int}, Reason: %{GREEDYDATA:Reason}' ]
I think it should be:

Code: Select all

match => [ 'message', '<%{POSINT:syslog_pri}>%{MONTH} +%{MONTHDAY} %{YEAR} %{TIME}: %%{WORD:LogType}-%{INT:LogSeverity}-%{INT:LogMessageNumber}: Group = (%{USERNAME:Group}|%{IP:Group}), Username = %{IPORHOST:username}, IP = %{IP:IPAddress}, Session disconnected. Session Type: %{NOTSPACE:SessionType}, Duration: (%{NUMBER:DurationDays}d ?)?%{NUMBER:DurationHours}h:%{NUMBER:DurationMinutes}m:%{NUMBER:DurationSeconds}s, Bytes xmt: %{INT:BytesTransmitted:int}, Bytes rcv: %{INT:BytesReceived:int}, Reason: %{GREEDYDATA:Reason}' ]
The { was added for the Duration field:

Code: Select all

Duration: (%{NUMBER:DurationDays}d ?)
benhank
Posts: 1264
Joined: Tue Apr 12, 2011 12:29 pm

Re: help creating a filter

Post by benhank »

I made the changes and it's looking good!
After I get some data I'll post a screenshot.
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: help creating a filter

Post by cdienger »

Sounds good. Keep us posted :)
benhank
Posts: 1264
Joined: Tue Apr 12, 2011 12:29 pm

Re: help creating a filter

Post by benhank »

Ok, since I fully understand that I am asking to be taught how to do grok filtering at this point, take your time in responding. I know you guys have a ton of mission-critical issues to resolve and this isn't one of them =D.
I am asking these questions here because I don't have test servers for NLS; any mistakes I make are there to stay.

(psst, feel free to accidentally PM me the book you fellas used to learn all this, btw... what happens in PMs stays in PMs >=D)
So here we go:
Question 1.
Let's take the logs that we have used so far:

Code: Select all

<14>Jun 11 10:01:25 Panorama.mycompany.org 1,2020/06/11 10:01:25,013101007499,SYSTEM,globalprotect,0,2020/06/11 10:01:20,,globalprotectgateway-config-succ,mycompany2-GW-N,0,0,general,informational,"GlobalProtect gateway client configuration generated. User name: lskywalker, Private IP: 123.123.123.123,  Client region: US, Client IP: 123.123.123.123, Client version: 5.0.7-2, Device name: AHNITERP0002L, Client OS version: Microsoft Windows 7 Enterprise Edition Service Pack 1, 64-bit, VPN type: Device Level VPN.",6822056328467525919,0x8000000000000000,0,0,0,0,,MBO-PA-1234-2

09:55:39,,globalprotectgateway-config-release,mycompany2-GW-N,0,0,general,informational,"GlobalProtect gateway client configuration released. User name: lskywalker, Private IP: 123.123.123.123, Client version: 5.0.7-2, Device name: AHNITHDSK006L, Client OS version: Microsoft Windows 7 Enterprise Edition Service Pack 1, 64-bit, VPN type: Device Level VPN.",6822062779510505745,0x8000000000000000,0,0,0,0,,MBO-PA-1234-1

<14>Jun 10 11:20:07 Panorama.mycompany.org 1,2020/06/10 11:20:07,013101007502,SYSTEM,auth,0,2020/06/10 11:19:59,,auth-success,DUO VIP with failback,0,0,general,informational,"When authenticating 
user 'lskywalker' from 123.123.123.123', a less secure authentication method PAP is used. Please migrate to PEAP or EAP-TTLS. Authentication Profile 'DUO VIP with failback', vsys 'vsys1', Server Profile 'DUO RADIUS - VIP', Server Address '123.123.123.123'",6822062779510453156,0x8000000000000000,0,0,0,0,,MBO-PA-1234-1


<14>Jun 10 11:19:32 Panorama.mycompany.org 1,2020/06/10 11:19:32,013101007502,SYSTEM,globalprotect,0,2020/06/10 11:19:26,,globalprotectportal-auth-fail,mycompany2-Portal,0,0,general,informational,"GlobalProtect portal user authentication failed. Login from: 123.123.123.123, Source region: US, User name: lskywalker, Client OS version: Microsoft Windows 7 Enterprise Edition Service Pack 1, 64-bit, Reason: Authentication failed: Invalid username or password, Auth type: profile.",6822062779510453137,0x8000000000000000,0,0,0,0,,MBO-PA-1234-1

<14>Jun 11 10:01:22 Panorama.mycompany.org 1,2020/06/11 10:01:22,013101007502,SYSTEM,globalprotect,0,2020/06/11 10:01:20,,globalprotectgateway-regist-fail,mycompany2-GW-N,0,0,general,informational,"GlobalProtect gateway user login failed. Login from: 123.123.123.123, Source region: US, User name: lskywalker, Client OS version: Microsoft Windows 7 Enterprise Edition Service Pack 1, 64-bit, error: Existing user session found.",6822062779510506095,0x8000000000000000,0,0,0,0,,MBO-PA-1234-1
If I use the grok debugger's Discover feature on those five lines I get:

Code: Select all

%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,globalprotect,0,20%{DATESTAMP},,globalprotectgateway-config-succ,mycompany2-GW-N,0,0,general,informational,%{QS},6822056328467525919,0x8000000000000000,0,0,0,0,,MBO-PA%{ISO8601_TIMEZONE}-2

%{HAPROXYTIME},,globalprotectgateway-config-release,mycompany2-GW-N,0,0,general,informational,%{QS},6822062779510505745,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}

%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,auth,0,20%{DATESTAMP},,auth-success,DUO VIP with failback,0,0,general,informational,%{QS},6822062779510453156,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}


%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,globalprotect,0,20%{DATESTAMP},,globalprotectportal-auth-fail,mycompany2-Portal,0,0,general,informational,%{QS},6822062779510453137,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}

%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,globalprotect,0,20%{DATESTAMP},,globalprotectgateway-regist-fail,mycompany2-GW-N,0,0,general,informational,%{QS},6822062779510506095,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}
If I created five filters for a single server, would Logstash figure out which filter matches from the five options I give it, or do I have to create a filter that somehow becomes a catch-all for an individual host? In other words, what is the best method to create a filter for a host that emits multiple log message formats?

Question 2.
How would I apply a newly created filter to old logs?

Question 3.
Is it possible to specify multiple individual IPs or an IP range in the:

Code: Select all

if [host] ==  '123.123.123

line?
Question 4, in two parts:
How do I find out which plugins for logstash are installed on my server?
and
Do the plugins that were created for later versions of logstash work with the version that ships with NLS?

Thanks for reading all this !
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: help creating a filter

Post by cdienger »

1. I'm a fan of using separate inputs for each log format if possible, but you can specify multiple patterns with match:

Code: Select all

grok {
	match => {
		"message" => [
			"%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,globalprotect,0,20%{DATESTAMP},,globalprotectgateway-config-succ,mycompany2-GW-N,0,0,general,informational,%{QS},6822056328467525919,0x8000000000000000,0,0,0,0,,MBO-PA%{ISO8601_TIMEZONE}-2",
			"%{HAPROXYTIME},,globalprotectgateway-config-release,mycompany2-GW-N,0,0,general,informational,%{QS},6822062779510505745,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}",
			"%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,auth,0,20%{DATESTAMP},,auth-success,DUO VIP with failback,0,0,general,informational,%{QS},6822062779510453156,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}",
			"%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,globalprotect,0,20%{DATESTAMP},,globalprotectportal-auth-fail,mycompany2-Portal,0,0,general,informational,%{QS},6822062779510453137,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}",
			"%{SYSLOG5424PRI}%{CISCOTIMESTAMP} %{JAVACLASS} 1,20%{DATESTAMP},SYSTEM,globalprotect,0,20%{DATESTAMP},,globalprotectgateway-regist-fail,mycompany2-GW-N,0,0,general,informational,%{QS},6822062779510506095,0x8000000000000000,0,0,0,0,,MBO-%{CISCOTAG}"
		]
	}
}
It can get a little messy this way which is why I prefer separate inputs.
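As a sketch of the separate-inputs approach (the port number and the type value below are made up for illustration, not taken from an actual config):

Code: Select all

input {
    syslog {
        port => 5515                 # hypothetical dedicated port for these senders
        type => "panorama_system"    # label events arriving on this input
    }
}
filter {
    if [type] == "panorama_system" {
        grok {
            # only the patterns for this log format go here
        }
    }
}
Each format then gets its own small, readable filter block instead of one long pattern list.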

Also, the Discover option is nice but doesn't always work as expected. For example, Panorama.mycompany.org matches the %{JAVACLASS} pattern, which may work here, but it's hard to say whether it will match all the possible values that could appear in that field.

2. The filters only apply to data that arrives after they are in place. Old data would need to be resent somehow, which may require a new input and possibly additional or tweaked filters. That's another post altogether.

3. Something like:

Code: Select all

if [host] == "123.123.123.123" or [host] == "123.123.123.124"
or:

Code: Select all

if [host] =~ "123.123.123.*"
Note that with =~ the dots are regex wildcards and the pattern is unanchored, so something like x123y123z123 would also match; escaping and anchoring, for example if [host] =~ /^123\.123\.123\./, is safer. Even then, matching on a prefix only works if the range falls on an 8-, 16-, or 24-bit boundary. Something more complex would need something like:

Code: Select all

cidr {
    add_tag => [ "ADDEDTAG" ]
    address => [ "%{host}" ]
    network => [ "123.123.123.0/23" ]
}
You could then filter on the tag:

Code: Select all

if "ADDEDTAG" in [tags] { ... }
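Putting those pieces together, a complete filter section might look like the following. The tag name and network ranges are placeholders; note that add_tag appends to the tags array, so the conditional tests membership in [tags]:

Code: Select all

filter {
    cidr {
        add_tag => [ "vpn_subnet" ]                       # placeholder tag name
        address => [ "%{host}" ]
        network => [ "123.123.123.0/23", "10.0.0.0/8" ]   # placeholder ranges
    }
    if "vpn_subnet" in [tags] {
        # filters for hosts in those ranges go here
    }
}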
Couple useful links:

https://www.elastic.co/guide/en/logstas ... -cidr.html
https://www.elastic.co/guide/en/logstas ... ation.html

4. You can run:

Code: Select all

/usr/local/nagioslogserver/logstash/bin/logstash-plugin list --installed --verbose
Newer plugins may work but you'd have to check the plugin notes.
benhank
Posts: 1264
Joined: Tue Apr 12, 2011 12:29 pm

Re: help creating a filter

Post by benhank »

I looked at my installed plugins but I didn't see the kv filter:

Code: Select all

[root@lkenmycroftp01 ~]# /usr/local/nagioslogserver/logstash/bin/logstash-plugin list --verbose

logstash-codec-avro (2.0.4)
logstash-codec-cef (2.1.3)
logstash-codec-cloudfront (2.0.4)
logstash-codec-collectd (2.0.4)
logstash-codec-compress_spooler (2.0.4)
logstash-codec-csv (0.1.2)
logstash-codec-dots (2.0.4)
logstash-codec-edn (2.0.4)
logstash-codec-edn_lines (2.0.4)
logstash-codec-es_bulk (2.0.4)
logstash-codec-fluent (2.0.4)
logstash-codec-graphite (2.0.4)
logstash-codec-gzip_lines (2.0.4)
logstash-codec-json (2.1.4)
logstash-codec-json_lines (2.1.3)
logstash-codec-line (2.1.2)
logstash-codec-msgpack (2.0.4)
logstash-codec-multiline (2.0.11)
logstash-codec-netflow (2.1.1)
logstash-codec-nmap (0.0.18)
logstash-codec-oldlogstashjson (2.0.4)
logstash-codec-plain (2.0.4)
logstash-codec-protobuf (0.1.3)
logstash-codec-rubydebug (2.0.7)
logstash-codec-s3plain (2.0.4)
logstash-codec-sflow (1.2.1)
logstash-filter-age (1.0.0)
logstash-filter-aggregate (2.5.1)
logstash-filter-alter (2.0.5)
logstash-filter-anonymize (2.0.4)
logstash-filter-checksum (2.0.4)
logstash-filter-cidr (2.0.4)
logstash-filter-cipher (2.0.5)
logstash-filter-clone (2.0.6)
logstash-filter-cloudfoundry (0.4.0)
logstash-filter-collate (2.0.4)
logstash-filter-csv (2.1.3)
logstash-filter-date (2.1.6)
logstash-filter-de_dot (1.0.0)
logstash-filter-dissect (1.0.6)
logstash-filter-dns (2.1.3)
logstash-filter-drop (2.0.4)
logstash-filter-elapsed (3.0.2)
logstash-filter-elasticsearch (2.1.1)
logstash-filter-environment (2.0.6)
logstash-filter-extractnumbers (2.0.4)
logstash-filter-fingerprint (2.0.5)
logstash-filter-geoip (4.0.4)
logstash-filter-grok (2.0.5)
logstash-filter-i18n (2.0.4)
logstash-filter-json (2.0.6)
logstash-filter-json_encode (2.0.4)
logstash-filter-kv (2.1.0)
logstash-filter-lookup (2.0.0)
logstash-filter-math (0.2)
logstash-filter-metaevent (2.0.4)
logstash-filter-metricize (2.0.4)
logstash-filter-metrics (3.0.2)
logstash-filter-multiline (2.0.5)
logstash-filter-mutate (2.0.6)
logstash-filter-oui (2.0.4)
logstash-filter-prune (2.0.6)
logstash-filter-punct (2.0.4)
logstash-filter-range (2.0.4)
logstash-filter-ruby (2.0.5)
logstash-filter-sleep (2.0.4)
logstash-filter-split (2.0.5)
logstash-filter-syslog_pri (2.0.4)
logstash-filter-throttle (2.0.4)
logstash-filter-tld (2.0.4)
logstash-filter-translate (2.1.4)
logstash-filter-truncate (1.0.0)
logstash-filter-unique (2.0.4)
logstash-filter-urldecode (2.0.4)
logstash-filter-useragent (2.0.8)
logstash-filter-uuid (2.0.5)
logstash-filter-xml (2.2.0)
logstash-filter-zeromq (2.1.1)
logstash-input-beats (3.1.14)
logstash-input-cloudwatch (1.1.3)
logstash-input-couchdb_changes (2.0.4)
logstash-input-elasticsearch (2.0.5)
logstash-input-eventlog (3.0.3)
logstash-input-exec (2.0.6)
logstash-input-file (2.2.5)
logstash-input-fluentd (2.0.4)
logstash-input-ganglia (2.0.6)
logstash-input-gelf (2.0.7)
logstash-input-gemfire (2.0.4)
logstash-input-generator (2.0.4)
logstash-input-github (2.0.5)
logstash-input-google_pubsub (0.9.0)
logstash-input-graphite (2.0.7)
logstash-input-heartbeat (2.0.4)
logstash-input-http (2.2.3)
logstash-input-http_poller (2.1.0)
logstash-input-imap (2.0.5)
logstash-input-irc (2.0.5)
logstash-input-jdbc (3.1.0)
logstash-input-jms (2.0.4)
logstash-input-jmx (2.0.4)
logstash-input-kafka (2.1.0)
logstash-input-kinesis (1.6.0)
logstash-input-log4j (2.0.7)
logstash-input-lumberjack (2.0.7)
logstash-input-meetup (2.0.4)
logstash-input-pipe (2.0.4)
logstash-input-puppet_facter (2.0.4)
logstash-input-rabbitmq (4.1.0)
logstash-input-redis (2.0.6)
logstash-input-relp (2.0.5)
logstash-input-rss (2.0.5)
logstash-input-s3 (2.0.6)
logstash-input-salesforce (2.0.4)
logstash-input-snmptrap (2.0.4)
logstash-input-sqlite (2.0.4)
logstash-input-sqs (2.0.5)
logstash-input-stdin (2.0.4)
logstash-input-stomp (2.0.5)
logstash-input-syslog (2.0.5)
logstash-input-tcp (4.2.4)
logstash-input-twitter (2.2.2)
logstash-input-udp (2.0.5)
logstash-input-unix (2.0.6)
logstash-input-varnishlog (2.0.4)
logstash-input-websocket (3.0.2)
logstash-input-wmi (2.0.5)
logstash-input-xmpp (2.0.5)
logstash-input-zenoss (2.0.4)
logstash-input-zeromq (2.0.4)
logstash-output-boundary (2.0.4)
logstash-output-circonus (2.0.4)
logstash-output-cloudwatch (2.0.4)
logstash-output-csv (2.0.5)
logstash-output-datadog (2.0.4)
logstash-output-datadog_metrics (2.0.4)
logstash-output-elasticsearch (2.7.1)
logstash-output-elasticsearch-ec2 (2.0.4)
logstash-output-elasticsearch_java (2.1.3)
logstash-output-email (3.0.5)
logstash-output-exec (2.0.5)
logstash-output-file (2.2.5)
logstash-output-ganglia (2.0.4)
logstash-output-gelf (2.0.5)
logstash-output-google_bigquery (3.0.1)
logstash-output-graphite (2.0.5)
logstash-output-graphtastic (2.0.4)
logstash-output-hipchat (3.0.4)
logstash-output-http (2.1.3)
logstash-output-influxdb (3.1.2)
logstash-output-irc (2.0.4)
logstash-output-jms (2.0.4)
logstash-output-juggernaut (2.0.4)
logstash-output-kafka (2.0.5)
logstash-output-librato (2.0.4)
logstash-output-loggly (2.0.5)
logstash-output-lumberjack (2.0.6)
logstash-output-metriccatcher (2.0.4)
logstash-output-monasca_log_api (0.5.3)
logstash-output-mongodb (2.0.5)
logstash-output-nagios (2.0.4)
logstash-output-nagios_nsca (2.0.5)
logstash-output-null (2.0.4)
logstash-output-opentsdb (2.0.4)
logstash-output-pagerduty (2.0.4)
logstash-output-pipe (2.0.4)
logstash-output-rabbitmq (3.1.2)
logstash-output-rados (1.0.1)
logstash-output-redis (2.0.5)
logstash-output-redmine (2.0.4)
logstash-output-riemann (2.0.5)
logstash-output-s3 (2.0.7)
logstash-output-sns (3.0.4)
logstash-output-solr_http (2.0.4)
logstash-output-sqs (2.0.5)
logstash-output-statsd (2.0.7)
logstash-output-stdout (2.0.6)
logstash-output-stomp (2.0.5)
logstash-output-syslog (2.1.4)
logstash-output-tcp (2.0.4)
logstash-output-udp (2.0.4)
logstash-output-websocket (2.0.4)
logstash-output-xmpp (2.0.4)
logstash-output-zabbix (2.0.2)
logstash-output-zeromq (2.1.0)
logstash-output-zookeeper (1.0.0)
logstash-patterns-core (2.0.5)
How do I safely install it?
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: help creating a filter

Post by cdienger »

It's in there. Run the list through grep to find it:

Code: Select all

/usr/local/nagioslogserver/logstash/bin/logstash-plugin list --installed --verbose | grep kv
Locked