import from file

harveyt
Posts: 3
Joined: Mon Sep 17, 2018 2:04 pm

import from file

Post by harveyt »

While using the import-from-file function
"cat <some file name.xml> | python shipper.py syslog_program:AUREPM_DB_AUD | nc xxx.xxx.xxx.xxx 2057"
it appears that the file is not being imported in order.

Here is the file contents:
-----BEGIN /emspro/admin/export/cmspro/audit/cmspro_pmon_19690_20190116075707931217374332.xml-----
<?xml version="1.0" encoding="UTF-8"?>
<Audit xmlns="http://xmlns.oracle.com/oracleas/schema ... l-11_2.xsd"
xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
xsi:schemaLocation="http://xmlns.oracle.com/oracleas/schema ... l-11_2.xsd">
<Version>11.2</Version>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-16T13:57:07.935610Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-16T18:03:33.565170Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-16T18:23:32.665869Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-16T20:29:59.835455Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-16T22:42:36.559344Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-17T13:51:25.899583Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-17T15:34:01.952465Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-17T16:44:44.971339Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-17T20:24:49.044360Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-18T07:38:52.878202Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-18T18:35:04.150208Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-18T20:37:37.263708Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-18T20:46:41.536937Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-18T22:53:46.579045Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-19T07:54:39.647894Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-20T07:58:29.992684Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-20T16:01:47.597532Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-21T15:27:18.488246Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>
</Audit>
-----END /emspro/admin/export/cmspro/audit/cmspro_pmon_19690_20190116075707931217374332.xml-----

My dashboard query is "*AUREPM_DB_AUD*".

And the attached picture is what I end up with.

I have many tens of thousands of database audit logs that I must retain for a year, and I need this tool to reflect exactly what is in each log file.

How can this be fixed?
Is this sort of thing happening to every other file that NLS is managing for me (i.e., files other than those imported manually)?

Thanks in advance for any insight...
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: import from file

Post by cdienger »

This is likely due to the multiple workers Logstash uses to parse data - the workers operate independently of one another and will parse and insert data as it comes in, not necessarily in the order it appears in the file. This is because Log Server isn't meant to simply store a replica of a file - it is meant to parse and make sense of the data, and it displays events based on the @timestamp field (in this case, when the event was inserted).
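
As an aside, Logstash does have a setting for the number of pipeline workers, and running with a single worker generally preserves arrival order at the cost of throughput. Treat this as a sketch only - Log Server manages its own Logstash instance, so the exact invocation and config path here are assumptions to verify:

Code:

# Run Logstash with a single pipeline worker so events are processed in arrival order.
# '-w' is the standard Logstash flag for worker count; the config path is a placeholder.
logstash -w 1 -f /path/to/pipeline.conf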

I see a couple of possibilities as far as parsing goes. We could come up with something that parses the audit records and pulls the time from those records so they show up in the correct order. In this case we would be pulling lines like:

<AuditRecord><Audit_Type>1</Audit_Type><Extended_Timestamp>2019-01-16T18:03:33.565170Z</Extended_Timestamp><OS_Process>19690</OS_Process><Action>102</Action><Returncode>0</Returncode>
</AuditRecord>


Is this acceptable for your case?
harveyt
Posts: 3
Joined: Mon Sep 17, 2018 2:04 pm

Re: import from file

Post by harveyt »

At first glance it seems that a parser keyed on the Extended_Timestamp tag will work for each audit record, so long as the audit record's closing tag is matched (it may or may not be on the same file line as the opening tag). I also need to ensure that the -----BEGIN and -----END lines are maintained in order so that they stay matched with the audit file contents. Let's give it a go and see if it works for us.

Here is a quick look at what I'm doing, please let me know if you need more info:
We have lots of Oracle DBs that each create small daily audit logs in large numbers (300 to >1k per day per DB). In order to get these log files into NLS for both perusal and archiving, I concatenate them into a "master file", with the BEGIN and END lines indicating the log file name, and then import the big master file. We need to keep these things tidy in the event we get audited, so that we can show the auditors we can logically trace an event through our logs. The logged data being out of order was a concern, and when that same data was pulled again, the order changed yet again. That, suffice it to say, caused more than a few eyebrows to raise.
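
For clarity, the concatenation step looks roughly like this (the audit directory comes from the example above; the master file name and location are placeholders):

Code:

# Roll the day's Oracle audit logs into one master file, wrapping each
# log with BEGIN/END markers that carry the original file name.
MASTER=/tmp/audit_master.xml
: > "$MASTER"
for f in /emspro/admin/export/cmspro/audit/*.xml; do
    printf -- '-----BEGIN %s-----\n' "$f" >> "$MASTER"
    cat "$f" >> "$MASTER"
    printf -- '-----END %s-----\n' "$f" >> "$MASTER"
done
# Then import it as in the original post:
cat "$MASTER" | python shipper.py syslog_program:AUREPM_DB_AUD | nc xxx.xxx.xxx.xxx 2057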

I need some assurance that the files we retrieve via the rsyslog function are not experiencing this same recall "randomness"...but based on your response, I strongly suspect that we may need another solution for those logs as well.
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: import from file

Post by cdienger »

The syslog files shouldn't be an issue since those should largely be in chronological order and their timestamps are recognized, so they come up in order.

Unfortunately I don't see an easy way to associate the events in the db logs with the filenames. That said, you could import the logs and parse them for searching, and keep the raw files elsewhere in case there's a need to go really in-depth with them. I came up with the following filter to parse the AuditRecord lines, drop everything else, and correct the timestamp:

Code:

if [type] == 'import_json' {
  # Pull the fields out of each <AuditRecord> line.
  grok {
    match => { "message" => "<AuditRecord><Audit_Type>%{INT:Audit_Type}</Audit_Type><Extended_Timestamp>%{TIMESTAMP_ISO8601:date}</Extended_Timestamp><OS_Process>%{INT:OS_Process}</OS_Process><Action>%{INT:Action}</Action><Returncode>%{INT:Returncode}</Returncode>" }
  }

  # Drop anything that didn't match (the XML header, BEGIN/END markers, etc.).
  if (![Audit_Type]) {
    drop {}
  }

  # Set @timestamp from the record's Extended_Timestamp.
  date {
    match => [ "date", "ISO8601" ]
  }
}
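
Once the filter is in place, you can re-run the same import from the first post to confirm the records now sort by their Extended_Timestamp (command unchanged, IP masked as before):

Code:

cat <some file name.xml> | python shipper.py syslog_program:AUREPM_DB_AUD | nc xxx.xxx.xxx.xxx 2057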
harveyt
Posts: 3
Joined: Mon Sep 17, 2018 2:04 pm

Re: import from file

Post by harveyt »

After some discussion with our DBAs, this solution works great...with this one log file. My bad: I chose a simple file for demonstration purposes so that I wouldn't have so much redaction to perform.

The DB audit files are auto-created by Oracle and will contain varying data, i.e. privileged account logins, every command a privileged account runs, table modifications, connections, disconnects, session timeouts, crash stack traces, etc. With the files being .xml, it's a situation of build-your-tag-and-they-will-come. Our other logs are much the same way; we do not tag every line with a timestamp...each log entry is associated with a timestamp, but said entry may be many, many lines long. The logging tool would need hundreds of customized grok patterns to deal with any given log file.

First in, first out or last in, first out doesn't matter, but the log data must be imported and reported in the order in which it appears in the files...a prime example is a crash stack trace. How does one follow a stack trace if it is reported out of order? Arguably, once an event is triggered we could log in to the impacted system and trace the local log file. Though you don't need me reciting your marketing literature, the fact that NLS does not accurately reflect a log's contents defeats a major marketing benefit touting centralized log monitoring and "network insights".

How can we fix this so that we can comply with DISA STIGs and pass our audits?
cdienger
Support Tech
Posts: 5045
Joined: Tue Feb 07, 2017 11:26 am

Re: import from file

Post by cdienger »

Log Server is able to handle multiple lines and group them together as a single event, but you would need to use a different input for this. For example:

tcp {
  # A line containing <AuditRecord> is joined with the line(s) that follow,
  # so each record and its closing tag arrive as a single event.
  codec => multiline {
    pattern => '<AuditRecord>'
    negate => false
    what => next
  }
  port => 6677
  type => 'multiline'
}


You would then send the logs over without using the shipper script:

cat logname.log | nc nls_ip 6677

More about multiline setups can be found at https://www.elastic.co/guide/en/logstas ... iline.html.
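
One note tying this back to the earlier filter: events arriving through this input are given type 'multiline', so the conditional in the filter above would need to check [type] == 'multiline' instead of 'import_json' for the grok and date handling to apply to them.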