I'm trying to use Logstash to filter MongoDB logs and I'm having a problem with duplicated field values...
This is my Logstash filter configuration:
filter {
  if [type] == "mongo" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}%{SPACE}%{MONGO3_SEVERITY:severity}%{SPACE}%{MONGO3_COMPONENT:component}%{SPACE}\[%{WORD:context}\]%{SPACE}%{GREEDYDATA:logmessage}" }
      overwrite => [ "timestamp", "severity", "component", "context", "logmessage" ]
      break_on_match => true
    }
  }
}
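In case it helps reproduce this, a minimal test pipeline along these lines (stdin input with the type forced to "mongo" so the conditional fires, and stdout with the rubydebug codec; just a sketch for testing, not my real input/output config) should show the same parsed fields:

input {
  stdin {
    # force the type so the conditional in the filter above matches
    type => "mongo"
  }
}
output {
  # print each parsed event with one field per line
  stdout { codec => rubydebug }
}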
The thing is, I'm getting duplicate values in each field, for example:
logmessage: Successfully authenticated as principal admin on admin, Successfully authenticated as principal admin on admin
severity: I, I
component: ACCESS, ACCESS
timestamp: 2016-08-04T15:07:28.562+0000, 2016-08-04T15:07:28.562+0000
Does anyone have any idea why this happens? I tried with and without break_on_match, and with and without overwrite...
Thanks!
Edit: Sample messages
2016-08-04T15:07:28.562+0000 I ACCESS [conn28820] Successfully authenticated as principal admin on admin
2016-08-04T17:27:48.531+0000 I NETWORK [initandlisten] connection accepted from XXX.XXX.XXX.XXX:XXXX #29101 (11 connections now open)
2016-08-04T17:26:28.565+0000 I ACCESS [conn29098] Successfully authenticated as principal admin on admin
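For reference, this is what I'd expect each event to look like after the filter (a single value per field), based on the first sample line:

timestamp: 2016-08-04T15:07:28.562+0000
severity: I
component: ACCESS
context: conn28820
logmessage: Successfully authenticated as principal admin on admin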