The guide on parsing CSV logs with Lua describes how to parse CSV logs whose values do not contain line breaks. However, according to RFC 4180, CSV values enclosed in double quotes can contain line breaks. This means that parsing arbitrary CSV logs requires handling such line breaks correctly.
In the general case this cannot be accomplished using the multiline option of the file source, because that option uses regular expressions for delimiting lines, while this use case requires a full-fledged CSV parser.
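For instance, a single record whose quoted field contains a line break spans two physical lines; the record below is a made-up example of such input:

```csv
2024-01-01T00:00:00Z,service-a,"multi-line
error message",500
```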
A Minimal Example
It is possible to implement merging of CSV log lines using the same lua-csv module as in the guide on parsing CSV logs. The underlying algorithm is the following:
- Parse the incoming log line as a CSV row.
- Check the number of fields in it.
- If the number of fields matches the expected number, then the log line contains all necessary fields and can be processed further.
- Otherwise, store the log line in the state of the transform; when the next event comes, merge its log line with the stored ones and repeat parsing.
Such an algorithm can be implemented, for example, with the following transform config:
```toml
[transforms.lua]
inputs = []
type = "lua"
version = "2"

source = """
  csv = require("csv") -- load the `lua-csv` module
  expected_columns = 23 -- expected number of columns in incoming CSV lines
  line_separator = "\\r\\n" -- note the double escaping required by the TOML format
"""

hooks.process = """
  function (event, emit)
    if merged_event == nil then -- a global variable containing the merged event
      merged_event = event -- if it is empty, set it to the current event
    else
      -- otherwise, concatenate the line in the stored merged event
      -- with the next line
      merged_event.log.message = merged_event.log.message ..
        line_separator .. event.log.message
    end

    fields = csv.openstring(merged_event.log.message):lines()() -- parse the merged line as CSV
    if #fields < expected_columns then
      return -- not all fields are present in the merged event yet
    end

    -- do something with the array of the parsed fields
    merged_event.log.csv_fields = fields -- for example, just store them in an
                                         -- array field

    emit(merged_event) -- emit the resulting event
    merged_event = nil -- clear the merged event
  end
"""
```
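To experiment with this logic outside of Vector, the same merge-and-count check can be reproduced with plain Lua and the lua-csv module. The snippet below is a standalone sketch: the four-column format, the sample lines, and the variable names are assumptions made for illustration only.

```lua
-- Standalone sketch: merge physical lines until a complete CSV row is parsed.
-- Requires the lua-csv module (e.g. `luarocks install csv`).
local csv = require("csv")

local expected_columns = 4
local line_separator = "\r\n"
local buffer = nil

-- Two physical lines that together form one CSV record
-- with a line break inside a quoted field.
local incoming_lines = {
  '2024-01-01T00:00:00Z,service-a,"multi-line',
  'error message",500',
}

for _, line in ipairs(incoming_lines) do
  if buffer == nil then
    buffer = line -- start a new buffered record
  else
    buffer = buffer .. line_separator .. line -- append the next physical line
  end

  local fields = csv.openstring(buffer):lines()() -- parse the buffered data as CSV
  if fields ~= nil and #fields >= expected_columns then
    print("complete row with " .. #fields .. " fields")
    buffer = nil -- reset the buffer for the next record
  end
end
```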
How It Works
The merging process can be represented using the following diagram:
The lua transform has internal state, which can be accessed and modified from user-defined code using global variables. Initially the state is empty, which corresponds to the merged_event variable being set to nil.
As events arrive at the transform, they cause the merged_event variable to hold an aggregated event, thus making the state non-empty.
In the end, when the state holds enough data to extract all fields, a merged event is emitted and the state is emptied. Then the process repeats as new events arrive.
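The same state mechanism can be illustrated with a smaller, self-contained transform. The sketch below is not part of the CSV example; the transform name and the events_seen field are made up, and only demonstrate that global variables persist between invocations of the process hook:

```toml
[transforms.counter]
inputs = []
type = "lua"
version = "2"

hooks.process = """
  function (event, emit)
    if events_seen == nil then -- a global variable, persists between calls
      events_seen = 0
    end
    events_seen = events_seen + 1
    event.log.events_seen = events_seen -- annotate the event with the counter
    emit(event)
  end
"""
```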
Safety Checks
The merging algorithm used above is simple and would work for data coming from trusted sources. However, in the general case the CSV might be malformed, for example when some field is not terminated by a closing double quote, which can cause unbounded growth of the message field. In order to prevent this, it is possible to replace the following lines
```lua
merged_event.log.message = merged_event.log.message ..
  line_separator .. event.log.message
```
in the definition of the process hook with this code:
```lua
merged_event = safe_merge(merged_event, event)
if not merged_event then
  return
end
```
and add the following definition of the safe_merge function to the source section of the config:
```lua
function safe_merge(merged_event, event)
  if #merged_event.log.message + #event.log.message > 4096 then
    return nil
  else
    merged_event.log.message = merged_event.log.message ..
      line_separator .. event.log.message
    return merged_event
  end
end
```
This function checks that the total length of the merged lines is not larger than 4096 bytes (the limit can be raised if a particular use case requires it) and, if that is the case, performs the actual merging.
In general, it is recommended to always add such safety checks to the code of your custom transforms in order to ensure that malformed input does not cause unbounded memory growth or other kinds of undesired behavior.
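Further checks can be added in the same style. For example, the hypothetical variant of safe_merge below (not part of the guide's config; the limit values are arbitrary) also bounds the number of physical lines merged into a single event:

```lua
max_merged_lines = 50 -- assumed limit on the number of merged physical lines

function safe_merge(merged_event, event)
  -- count how many physical lines the merged event already contains
  local merged_lines = merged_event.log.merged_lines or 1
  if #merged_event.log.message + #event.log.message > 4096
      or merged_lines >= max_merged_lines then
    return nil -- drop the merge instead of letting the event grow further
  end
  merged_event.log.message = merged_event.log.message ..
    line_separator .. event.log.message
  merged_event.log.merged_lines = merged_lines + 1
  return merged_event
end
```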
Further Steps
After the problem of merging multi-line logs in custom formats is solved, you might be interested in checking out the following guides: