
A question about the output filter

Posted: Tue Jan 31, 2017 12:17 pm
by benhank
Hey guys.
I am running an experiment with Grafana. I'm trying to show that NLS can do the job of Grafana and a whole lot more, so I am setting up a "Pepsi challenge" of sorts between NLS, Elastic's ELK stack, and Grafana.
I want my current NLS server to export, say, 50 GB of data in JSON format to the remote server that's running the other stuff, while keeping a copy on my NLS.
I don't know how to use the output filter to make it happen. Any suggestions?

Re: A question about the output filter

Posted: Tue Jan 31, 2017 12:53 pm
by rkennedy
What is on the other end, another ELK stack?

If so, you should be able to use the Logstash tcp output (https://www.elastic.co/guide/en/logstas ... s-tcp.html):

Code: Select all

tcp {
    host => "<elkip>"
    port => "<portacceptingjson>"
    codec => "json"
}
That should do the trick, provided you have a TCP input accepting JSON on the ELK side of things.
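To cover the "keeping a copy on my NLS" part, the tcp output can sit alongside a file output in the same output block, since Logstash sends every event to each output listed. A minimal sketch; the path here and the <elkip>/<portacceptingjson> placeholders are just examples you'd fill in for your environment:

Code: Select all

output {
    # forward a JSON copy of each event to the remote ELK stack
    tcp {
        host => "<elkip>"
        port => "<portacceptingjson>"
        codec => "json"
    }
    # keep a local copy on the NLS server, one JSON object per line
    file {
        path => "/var/log/nls-export/export-%{+YYYY.MM.dd}.json"
        codec => "json_lines"
    }
}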

Re: A question about the output filter

Posted: Tue Feb 21, 2017 2:33 pm
by benhank
So the receiving ELK stack should have

Code: Select all

tcp {
    port => 9200
    type => "json"
}
correct?

Re: A question about the output filter

Posted: Tue Feb 21, 2017 3:26 pm
by mcapra
The receiving ELK stack's Logstash input should leverage the JSON codec, unless you have a filter rule that matches if [type] == 'json' and then parses the message as JSON. Here's the stock JSON input we use for NLS; it should work just as well on an OSS ELK stack:

Code: Select all

tcp {
    type => 'import_json'
    tags => 'import_json'
    port => 2057
    codec => json
}
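For completeness, the filter-based alternative mentioned above would look something like this. It's a sketch assuming events arrive with type set to "json" and the raw JSON string sitting in the message field:

Code: Select all

filter {
    # only parse events tagged with the matching type
    if [type] == "json" {
        # expand the JSON string in the message field into event fields
        json {
            source => "message"
        }
    }
}

With the codec approach above, no such filter is needed, since decoding happens at the input.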

Re: A question about the output filter

Posted: Tue Feb 21, 2017 3:46 pm
by benhank
Thanks man! You can lock it up.