```
input {
  generator {
    lines => ['<sample_xml_log>']
  }
}
filter {
  xml {
    source => "message"
    target => "entry"
  }
  metrics {
    meter => "events"
    add_tag => "metric"
  }
}
output {
  # only emit events with the 'metric' tag
  if "metric" in [tags] {
    stdout {
      codec => line { format => "rate: %{[events][rate_1m]}" }
    }
  }
}
```
I've run an xml filter benchmark against two types of XML log entries: one where the nested XML payload is wrapped in a CDATA section, and one where it is not.
In both cases, the nested XML text is ~18k characters.
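To make the two shapes concrete, here is a minimal Python sketch; the element names (logEntry, body, record) and the payload contents are made up for illustration, and only the ~18k payload size matches the description above:

```python
# Build two hypothetical versions of the same log entry: one carrying the nested
# XML payload inside a CDATA section, one carrying it as escaped element text.
from xml.sax.saxutils import escape

# Made-up nested payload, repeated until it is roughly 18k characters long.
nested_xml = "<record><id>42</id><data>payload</data></record>" * 375  # 18000 chars

cdata_entry = "<logEntry><body><![CDATA[%s]]></body></logEntry>" % nested_xml
escaped_entry = "<logEntry><body>%s</body></logEntry>" % escape(nested_xml)

print(len(cdata_entry), len(escaped_entry))
```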
I ran the benchmark on a dedicated Core i5 machine (4 cores) with 16 GB RAM.
I used Logstash v2.4.0.
The JVM is:
Linux:
The Logstash configuration is the one shown above (from https://www.elastic.co/guide/en/logstash/2.4/plugins-filters-metrics.html).
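For readers unfamiliar with the xml filter, the relevant step is that it takes the raw XML string from the message field and stores the parsed structure under the entry field. Below is a rough Python sketch of that transformation, purely for illustration; it is not the filter's actual Ruby implementation, and only the field names come from the config above:

```python
# Illustrative equivalent of `xml { source => "message" target => "entry" }`:
# parse the XML string in event["message"] and put the result under event["entry"].
import xml.etree.ElementTree as ET

def element_to_dict(elem):
    """Recursively convert an Element into a plain dict/str structure."""
    children = list(elem)
    if not children:
        return elem.text
    out = {}
    for child in children:
        out.setdefault(child.tag, []).append(element_to_dict(child))
    return out

def xml_filter(event, source="message", target="entry"):
    root = ET.fromstring(event[source])
    event[target] = {root.tag: element_to_dict(root)}
    return event

event = {"message": "<logEntry><body>hello</body></logEntry>"}
print(xml_filter(event)["entry"])  # {'logEntry': {'body': ['hello']}}
```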
The output for the log entry that uses a CDATA section is:
The output for the log entry that does NOT use a CDATA section is:
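The same comparison is easy to sanity-check outside Logstash. Here is a minimal timing sketch using Python's standard-library parser; the entry shapes and the ~18k payload size are assumptions based on the description above, and the absolute numbers will of course not match Logstash's Ruby-based xml filter:

```python
# Time parsing of the same ~18k-char payload carried either inside a CDATA
# section or as escaped element text (hypothetical entry shapes, as above).
import timeit
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

nested_xml = "<record><id>42</id><data>payload</data></record>" * 375  # ~18k chars
samples = {
    "cdata":   "<logEntry><body><![CDATA[%s]]></body></logEntry>" % nested_xml,
    "escaped": "<logEntry><body>%s</body></logEntry>" % escape(nested_xml),
}

for name, doc in samples.items():
    secs = timeit.timeit(lambda: ET.fromstring(doc), number=2000)
    print("%-8s %.3f s for 2000 parses" % (name, secs))
```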
I believe it's rather odd that the use of CDATA actually makes things worse. In theory, the parser should just ignore the enclosed content and fast-forward until it finds the closing ]]>.
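At least the semantics of that expectation are easy to verify for a conforming parser in general (though not for Logstash's specific stack): a CDATA section is purely a lexical shortcut, so the parsed text comes out identical to the escaped form, and the only extra work inside CDATA should be scanning for the closing ]]>, with no entity decoding. A small sketch with Python's standard-library parser, using the same hypothetical entry shape as above:

```python
# CDATA is only a lexical form: the parsed text is identical to the escaped form.
# (Hypothetical entry shape, as in the sketches above.)
import xml.etree.ElementTree as ET
from xml.sax.saxutils import escape

payload = "<record><data>a & b</data></record>"
cdata_doc = "<logEntry><body><![CDATA[%s]]></body></logEntry>" % payload
escaped_doc = "<logEntry><body>%s</body></logEntry>" % escape(payload)

assert ET.fromstring(cdata_doc).findtext("body") == payload
assert ET.fromstring(escaped_doc).findtext("body") == payload
print("both forms parse to the identical text payload")
```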