Is there an import filter to ingest CSV files?
Linuxlogger
- Posts: 32
- Joined: Thu Jun 23, 2016 4:33 pm
Is there an import filter to ingest CSV files?
I was wondering if there is an import filter that I can use as an example to import CSV files into NLS.
Linuxlogger
- Posts: 32
- Joined: Thu Jun 23, 2016 4:33 pm
Re: Is there an import filter to ingest CSV files?
OK so I found this link:
https://www.elastic.co/guide/en/logstas ... s-csv.html
Went into NLS and created the filter:
CSV import filter
filter {
  csv {
    # csv fields are strings by default; valid convert targets are
    # integer, float, date, date_time, and boolean
    convert => { "TIMESTAMP" => "date_time", "ROWSMODIFIED" => "integer", "ROWSRETURNED" => "integer" }
  }
}
My question is: now that I have the filter created, how do I implement it for the forwarded CSV files? How do I define "if TAG = 'xx', use the CSV import filter"?
Re: Is there an import filter to ingest CSV files?
The filters usually contain some sort of conditional logic when addressing log types. For example, say I have the following Input rule for all my Linux audit logs:

Code: Select all
input {
  tcp {
    port => 5545
    type => "audit_log"
  }
  udp {
    port => 5545
    type => "audit_log"
  }
}
In that Input rule I am setting the type to audit_log for all traffic on port 5545. This means logs sent over that port will have their type set to audit_log. The corresponding filter that breaks up that audit log into the fields I want to analyze looks like this:

Code: Select all
if [type] == 'audit_log' {
  # do my filtering
}
If the file exists logically in Linux, either as a path or in a mount point, you can use the file input to define an Input rule for that file:

Code: Select all
input {
  file {
    path => "/home/user/myfile.csv"
    start_position => "beginning"
    type => "my_csv_type_tag_or_something"
  }
}
At which point you could define a filter to work with that type:

Code: Select all
if [type] == 'my_csv_type_tag_or_something' {
  # do csv things!
}
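Tying those pieces together, a complete minimal config might look like the sketch below. The stdout output is just a placeholder for debugging, not something from this thread; in NLS the output stage is normally managed for you:

Code: Select all
input {
  file {
    path => "/home/user/myfile.csv"
    start_position => "beginning"
    type => "my_csv_type_tag_or_something"
  }
}
filter {
  if [type] == 'my_csv_type_tag_or_something' {
    # parse each line as comma-separated values
    csv { }
  }
}
output {
  # print parsed events to the console for debugging
  stdout { codec => rubydebug }
}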
Former Nagios employee
https://www.mcapra.com/
Linuxlogger
- Posts: 32
- Joined: Thu Jun 23, 2016 4:33 pm
Re: Is there an import filter to ingest CSV files?
Can I use the tag to identify the stream? If so, how would I do that?
Re: Is there an import filter to ingest CSV files?
Assuming there is a common value in the tags, you should be able to do that:
Code: Select all
if "value" in [tags] {
  # do value things
}
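For a conditional like that to match, something upstream has to put the value into the tags array. As a sketch, an Input rule can attach tags itself via the common tags option (the port and tag value here are placeholders, not taken from this thread):

Code: Select all
input {
  tcp {
    port => 5544            # placeholder port
    tags => ["DB2"]         # every event from this input carries the DB2 tag
  }
}
Any event arriving on that input would then satisfy a conditional such as if "DB2" in [tags].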
Former Nagios employee
https://www.mcapra.com/
Linuxlogger
- Posts: 32
- Joined: Thu Jun 23, 2016 4:33 pm
Re: Is there an import filter to ingest CSV files?
So...
This should work then?
# DB2 import filter
filter {
  if "DB2" in [tags] {
    csv {
      # only non-string conversions are needed; csv fields are strings by default
      convert => { "TIMESTAMP" => "date_time", "ROWSMODIFIED" => "integer", "ROWSRETURNED" => "integer" }
    }
  }
}
Re: Is there an import filter to ingest CSV files?
Not quite. For events in CSV format, it's helpful to define the columns before doing anything else:

Code: Select all
if [type] == 'csv_entry' {
  csv {
    columns => ["TIMESTAMP", "CATEGORY", "EVENT", "USERID", "ROWSMODIFIED", "ROWSRETURNED", "INSTNAME", "HOSTNAME"]
  }
}
That by itself should be enough to capture all the individual fields. Depending on how TIMESTAMP is formatted, you may then need to mutate that field for it to be recognized as a proper date_time or date in elasticsearch. If elasticsearch doesn't recognize the format of TIMESTAMP, it will probably default its type to string. That still makes the field itself searchable, just not as a proper date or date_time.

The resulting entry from the above filter is shown in the attached screenshot; my TIMESTAMP is poorly formatted, so it gets processed as a string by default.
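Following the mutate suggestion above, Logstash's date filter can promote TIMESTAMP to a real timestamp once its format is known. A sketch, assuming a hypothetical source format such as 2016-06-23 16:33:05 (the actual DB2 audit format isn't shown in this thread):

Code: Select all
filter {
  if [type] == 'csv_entry' {
    date {
      # hypothetical pattern; adjust to match the real TIMESTAMP values
      match => ["TIMESTAMP", "yyyy-MM-dd HH:mm:ss"]
      target => "@timestamp"
    }
  }
}
If the pattern matches, elasticsearch will index the event's @timestamp as a proper date rather than a string.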
Former Nagios employee
https://www.mcapra.com/