New Dedupe Transform

Shed duplicate logs

For certain use cases, log deduplication can be a useful tool. It not only protects the integrity of your data, but also guards against upstream mistakes that accidentally duplicate logs, a mistake that can easily double (or more!) your log volume. To protect against this, you can use our new dedupe transform.

Get Started

Simply add the transform to your pipeline:

  [transforms.my_transform_id]
  # General
  type = "dedupe" # required
  inputs = ["my-source-id"] # required

  # Fields
  fields.match = ["timestamp", "host", "message"] # optional, default

The fields.match option controls which fields are compared to determine whether two events are duplicates of each other.
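To make the matching behavior concrete, here is a minimal Python sketch of dedupe-by-field-matching. It is an illustration of the general technique, not the transform's actual implementation: the function name, the bounded cache, and its eviction policy are assumptions for the example.

```python
from collections import OrderedDict

def dedupe(events, match_fields, cache_size=5000):
    """Drop events whose values for match_fields repeat a recently seen event.

    Simplified sketch: the real transform's cache size and eviction
    behavior may differ.
    """
    seen = OrderedDict()  # bounded cache of recently seen match keys
    out = []
    for event in events:
        # Two events are "equal" if they agree on every matched field.
        key = tuple(event.get(f) for f in match_fields)
        if key in seen:
            continue  # duplicate: drop it
        seen[key] = True
        if len(seen) > cache_size:
            seen.popitem(last=False)  # evict the oldest key
        out.append(event)
    return out
```

Note that an event differing only in a field that is not listed in match_fields still counts as a duplicate, since only the matched fields are compared.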