Failed to flush outgoing items
Posted: Mon Jan 05, 2015 2:51 pm
Hello all,
I am getting the following errors in my Logstash, which break the transport pipe:
{:timestamp=>"2014-12-30T20:15:35.274000-0500", :message=>"Failed to flush outgoing items", :outgoing_count=>1, :exception=>#<Encoding::UndefinedConversionError: "\x80" from ASCII-8BIT to UTF-8>,
:backtrace=>["org/jruby/RubyString.java:7575:in `encode'", "json/ext/GeneratorMethods.java:71:in `to_json'", "/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:100:in
`bulk_ftw'", "org/jruby/RubyArray.java:2404:in `collect'", "/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:97:in `bulk_ftw'",
"/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch/protocol.rb:80:in `bulk'", "/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch.rb:315:in `flush'",
"/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:219:in `buffer_flush'", "org/jruby/RubyHash.java:1339:in `each'",
"/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:216:in `buffer_flush'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:193:in
`buffer_flush'", "/usr/local/nagioslogserver/logstash/vendor/bundle/jruby/1.9/gems/stud-0.0.17/lib/stud/buffer.rb:159:in `buffer_receive'", "/usr/local/nagioslogserver/logstash/lib/logstash/outputs/elasticsearch.rb:311:in
`receive'", "/usr/local/nagioslogserver/logstash/lib/logstash/outputs/base.rb:86:in `handle'", "/usr/local/nagioslogserver/logstash/lib/logstash/outputs/base.rb:78:in `worker_setup'"], :level=>:warn}
From what I have read, the issue is most likely a compatibility problem between Logstash and Elasticsearch?
Someone suggested adding a ruby filter as a workaround:
ruby {
  code => "
    begin
      if !event['message'].nil?
        event['message'] = event['message'].force_encoding('ASCII-8BIT').encode('UTF-8', :invalid => :replace, :undef => :replace, :replace => '?')
      end
    rescue
    end
  "
}
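For what it's worth, the conversion that filter performs can be reproduced in plain Ruby outside of Logstash. This is a minimal sketch (the `raw` string and its contents are my own example, not taken from the actual log data) showing why the bare `encode` raises the exact `Encoding::UndefinedConversionError` seen in the log, and how the `:invalid`/`:undef` replacement options avoid it:

```ruby
# A binary string containing the \x80 byte from the error message.
raw = "\x80abc".force_encoding('ASCII-8BIT')

# Without replacement options, \x80 has no UTF-8 mapping from ASCII-8BIT,
# so encode raises the same exception Logstash logs during bulk_ftw.
begin
  raw.encode('UTF-8')
rescue Encoding::UndefinedConversionError => e
  puts "raises: #{e.class}"
end

# With the replacement options from the suggested filter, the undefined
# byte is substituted with '?' and the result is valid UTF-8.
clean = raw.encode('UTF-8', :invalid => :replace, :undef => :replace, :replace => '?')
puts clean           # "?abc"
puts clean.encoding  # UTF-8
```

If that matches the behavior you want (lossy but safe), the filter should stop the flush failures; the trade-off is that any non-UTF-8 bytes in `message` are replaced with `?` rather than preserved.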