The Vector team is pleased to announce version 0.44.0!
Be sure to check out the upgrade guide for breaking changes in this release.
This release contains numerous enhancements and fixes.
## Known issues

- The `aws_s3` source in this version produces many internal logs. These logs will be downgraded to `debug` in the next release. Until then, you can suppress them with `VECTOR_LOG=info,vector::sources::aws_s3=warn`.

## VRL changes

VRL was updated to v0.21.0. This includes the following changes:
- `to_unix_timestamp`, `to_float`, and `uuid_v7` can now return an error if the supplied timestamp is unrepresentable as a nanosecond timestamp. Previously the function calls would panic. (https://github.com/vectordotdev/vrl/pull/979)
- Added a `crc` function to calculate CRC (Cyclic Redundancy Check) checksums.
- Added a `parse_cbor` function. (https://github.com/vectordotdev/vrl/pull/1152)
- Added a `zip` function to iterate over an array of arrays and produce a new array of arrays, each containing one item from each input. (https://github.com/vectordotdev/vrl/pull/1158)
- Added `decode_charset` and `encode_charset` functions to decode and encode strings between different charsets. (https://github.com/vectordotdev/vrl/pull/1162)
- Added an `object_from_array` function to create an object from an array of value pairs, such as what `zip` can produce. (https://github.com/vectordotdev/vrl/pull/1164)
- Added support for mixed-unit durations (e.g. `1h2s`, `2m3s`) in the `parse_duration` function. (https://github.com/vectordotdev/vrl/pull/1197)
- Added a `parse_bytes` function to parse byte strings such as `1MiB` or `1TB`, in either binary or decimal base. (https://github.com/vectordotdev/vrl/pull/1198)
- Added the `main` log format for `parse_nginx_log`. (https://github.com/vectordotdev/vrl/pull/1202)
- Added a `timezone` argument to the `parse_timestamp` function. (https://github.com/vectordotdev/vrl/pull/1207)
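As a quick sketch of how a few of these compose, assuming the single-array call forms described above (the commented return values are illustrative):

```vrl
# zip transposes an array of arrays
zip([[1, 2, 3], ["a", "b", "c"]])
# [[1, "a"], [2, "b"], [3, "c"]]

# object_from_array turns key/value pairs (e.g. zip output) into an object
object_from_array(zip([["one", "two"], [1, 2]]))
# { "one": 1, "two": 2 }

# parse_duration now accepts mixed-unit inputs
parse_duration!("1h2s", unit: "s")
# 3602.0
```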
## New features

- The `aws_s3` source now logs when S3 objects are fetched. If ACKs are enabled, it also logs on delivery. Thanks to fdamstra for contributing this change!
- The `log_to_metric` transform's tag keys are now templateable, which enables tag expansion. Thanks to titaneric for contributing this change!
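As a sketch of what this enables (component names and the `tag_name`/`tag_value` fields below are placeholders, not from the release notes), the tag *key* itself can now be a template expanded per event:

```toml
[transforms.count_by_label]
type = "log_to_metric"
inputs = ["my_source"]

[[transforms.count_by_label.metrics]]
type = "counter"
field = "message"
name = "events_total"
# Template in the key position, not just the value:
tags = { "{{ tag_name }}" = "{{ tag_value }}" }
```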
- Added a new `parse_dnstap` function that can parse dnstap data and produce output in the same format as the `dnstap` source. Thanks to esensar for contributing this change!
- Added a `force_path_style` option to the `aws_s3` sink that allows users to configure virtual-host-style addressing. The value defaults to `true` to maintain the existing path-style behavior. Thanks to sam6258 for contributing this change!
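A minimal sketch (bucket, region, and component names are placeholders):

```toml
[sinks.s3_out]
type = "aws_s3"
inputs = ["my_source"]
bucket = "my-bucket"
region = "us-east-1"
encoding.codec = "json"
# Set to false to use virtual-host-style addressing
# (e.g. https://my-bucket.s3.us-east-1.amazonaws.com);
# the default, true, keeps the existing path-style behavior.
force_path_style = false
```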
- The `socket` sink now supports `unix_datagram` as a valid `mode`. This feature is only available on Linux. Thanks to jpovixwm for contributing this change!
## Enhancements

- Improved handling in the `elasticsearch` sink when the `bulk.index` field cannot be resolved. Thanks to ArunPiduguDD for contributing this change!
- The base image for the `-alpine` and `-distroless-static` images was updated to `3.21`.
- The `GLACIER_IR` option was added to `storage_class` for the `aws_s3` sink. Thanks to MikeHsu0618 for contributing this change!
- The `kubernetes_logs` source now sets a `user-agent` header when querying the Kubernetes API server. Thanks to ganelo for contributing this change!
- The `clickhouse` sink's `skip_unknown_fields` setting is now optional, thereby allowing use of the defaults provided by the ClickHouse server. Setting it to `true` permits skipping unknown fields, while `false` makes ClickHouse strict about which fields it accepts. Thanks to PriceHiller for contributing this change!
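A sketch of the three resulting states (endpoint, table, and names are placeholders):

```toml
[sinks.clickhouse_out]
type = "clickhouse"
inputs = ["my_source"]
endpoint = "http://localhost:8123"
table = "logs"
# Omit skip_unknown_fields entirely to use the ClickHouse server's default.
# skip_unknown_fields = true   # skip fields not defined in the table
# skip_unknown_fields = false  # strict: reject events with unknown fields
```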
- Decompressed bytes are now used when applying `ignored_header_bytes`. Previously this was using the compressed bytes. For now, only gzip compression is supported. Thanks to roykim98 for contributing this change!
## Fixes

- The `filter` transform now generates a more accurate config via `vector generate`, by using a comparison rather than an assignment. Thanks to abcdam for contributing this change!
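Concretely, the generated placeholder condition is now a comparison rather than an assignment (this snippet is illustrative, not the exact generated output):

```toml
[transforms.my_filter]
type = "filter"
inputs = ["my_source"]
# a comparison (==), not an assignment (=), so the condition
# evaluates to a boolean instead of mutating the event:
condition = '.status == 500'
```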
- The `gcp_pubsub` source no longer has a 4MB message size limit. Thanks to sbalmos for contributing this change!
- Fixed the `opentelemetry` sink's input resolution. The sink now uses the underlying protocol to determine which inputs are accepted. Thanks to pront for contributing this change!
- The `sample` transform now correctly uses the configured `sample_rate_key` instead of always using `"sample_rate"`. Thanks to dekelpilli
for contributing this change!
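For example (names and the chosen key are placeholders), the configured key is now the one written to sampled events:

```toml
[transforms.sampled]
type = "sample"
inputs = ["my_source"]
rate = 10
# Previously events were always tagged with "sample_rate";
# the key configured below is now respected.
sample_rate_key = "applied_sample_rate"
```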