Description
I gathered from the MaxMind website that UK postal codes are returned as only their first 2-4 characters (https://dev.maxmind.com/geoip/geoip2/geoip2-city-country-csv-databases/).
Unfortunately this results in all of our parsed entries going to the dead letter queue, as these truncated postal codes are rejected as being in an invalid format when the events are indexed.
- Version: 6.4.1
- Operating System: CentOS 7.4
- Config File (if you have sensitive info, please remove it):
geoip {
  source => "clientip"
}
- Sample Data:
geoip => {
  continent_code => "EU",
  longitude => <binary value>,
  postal_code => "EC2V",
  country_name => "United Kingdom",
  country_code3 => "GB",
  location => { lat => <binary value>, lon => <binary value> },
  ip => "213.205.241.77",
  city_name => "London",
  latitude => <binary value>,
  country_code2 => "GB",
  region_code => "ENG",
  region_name => "England",
  timezone => "Europe/London"
},
clientip => "213.205.241.77"
(decoded from the binary dead letter queue dump; the latitude/longitude values are binary-encoded doubles and are not reproduced here)
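For reference, "EC2V" in the sample above is the outward code of a City of London postcode; per the MaxMind documentation linked above, only this first 2-4 character portion is returned. A quick illustration (the full postcode below is hypothetical):

```python
# A full UK postcode has an outward code (e.g. "EC2V") and an inward code
# (the "6DB" here is a made-up example); MaxMind keeps only the outward part.
full_postcode = "EC2V 6DB"
outward_code = full_postcode.split()[0]
print(outward_code)  # EC2V
```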
- Steps to Reproduce:
We parse our webserver logs and use the geoip filter on the clientip field. All our webserver log entries end up in the dead letter queue because Elasticsearch rejects them with this error:
Could not index event to Elasticsearch. status: 400, action: ["index", {:_id=>nil, :_index=>"filebeat-6.4.1-2018.12.12", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2e831888>], response: {"index"=>{"_index"=>"filebeat-6.4.1-2018.12.12", "_type"=>"doc", "_id"=>"pnGhoWcBhV0O84dhLCIa", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [geoip.postal_code]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"EC2V\""}
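The "Invalid format" mapper_parsing_exception suggests that geoip.postal_code was dynamically mapped to a non-string type in this index (that wording matches a date-parse failure), so a value like "EC2V" can no longer be parsed. If that is the cause, one possible workaround is to map the field explicitly as keyword via an index template; the template name and pattern below are illustrative only:

```
PUT _template/filebeat-geoip-postal-fix
{
  "index_patterns": ["filebeat-*"],
  "order": 1,
  "mappings": {
    "doc": {
      "properties": {
        "geoip": {
          "properties": {
            "postal_code": { "type": "keyword" }
          }
        }
      }
    }
  }
}
```

Since a mapping cannot be changed on an existing index, this only takes effect for newly created indices (e.g. the next day's filebeat-6.4.1-* index) or after a reindex.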