New Vector version 0.32.2

A lightweight, ultra-fast tool for building observability pipelines

Take control of your observability data

Collect, transform, and route all your logs and metrics with one simple tool.

Why Vector?

Ultra fast and reliable
Built in Rust, Vector is blisteringly fast, memory efficient, and designed to handle the most demanding workloads.
End to end
Vector strives to be the only tool you need to get observability data from A to B, deploying as a daemon, sidecar, or aggregator.
Vector supports logs and metrics, making it easy to collect and process all your observability data.
Vendor neutral
Vector doesn’t favor any specific vendor platform and fosters a fair, open ecosystem with your best interests in mind. Lock-in-free and future-proof.
Programmable transforms
Vector’s highly configurable transforms give you the full power of programmable runtimes. Handle complex use cases without limitation.
Clear guarantees
Guarantees matter, and Vector is clear on which guarantees it provides, helping you make the appropriate trade-offs for your use case.

A complete, end-to-end platform.

Deploy Vector in a variety of roles to suit your use case.
Get data from point A to point B without patching tools together.

Learn more about Vector's deployment topologies: distributed, centralized, and stream-based.

Easy to configure

A simple, composable format enables you to build flexible pipelines

Redact sensitive data from Datadog Agent logs before forwarding them to Datadog:

# Component names such as the sink name are illustrative
[sources.datadog_agent]
type = "datadog_agent"
address = ""

[transforms.remove_sensitive_user_info]
type = "remap"
inputs = ["datadog_agent"]
source = '''
  redact(., filters: ["us_social_security_number"])
'''

[sinks.datadog_backend]
type = "datadog_logs"
inputs = ["remove_sensitive_user_info"]
default_api_key = "${DATADOG_API_KEY}"
Parse JSON logs from Kafka and write them to Elasticsearch:

[sources.kafka_in]
type = "kafka"
bootstrap_servers = ","
group_id = "vector-logs"
key_field = "message"
topics = ["logs-*"]

[transforms.json_parse]
type = "remap"
inputs = ["kafka_in"]
source = '''
  parsed, err = parse_json(.message)
  if err != null {
    log(err, level: "error")
  }
  . |= object(parsed) ?? {}
'''

[sinks.elasticsearch_out]
type = "elasticsearch"
inputs = ["json_parse"]
endpoint = ""
index = "logs-via-kafka"
Archive Kubernetes logs to AWS S3:

[sources.k8s_in]
type = "kubernetes_logs"

[sinks.s3_archive]
type = "aws_s3"
inputs = ["k8s_in"]
bucket = "k8s-logs"
region = "us-east-1"
compression = "gzip"
encoding.codec = "json"
Forward Splunk HEC events to Datadog:

[sources.splunk_hec_in]
type = "splunk_hec"
address = ""
token = "${SPLUNK_HEC_TOKEN}"

[sinks.datadog_out]
type = "datadog_logs"
inputs = ["splunk_hec_in"]
default_api_key = "${DATADOG_API_KEY}"
The configuration examples above are shown in TOML, but Vector also supports YAML and JSON.
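As a sketch of the same format in YAML, the Datadog Agent redaction pipeline could be written roughly as follows (component names are illustrative, and the listen address is left empty as in the TOML example):

```yaml
# Sketch: YAML equivalent of the Datadog Agent redaction pipeline
sources:
  datadog_agent:
    type: datadog_agent
    address: ""   # listen address elided in the original example

transforms:
  remove_sensitive_user_info:
    type: remap
    inputs: ["datadog_agent"]
    source: |
      redact(., filters: ["us_social_security_number"])

sinks:
  datadog_backend:
    type: datadog_logs
    inputs: ["remove_sensitive_user_info"]
    default_api_key: "${DATADOG_API_KEY}"
```

The structure is the same in every supported format: named sources, transforms, and sinks, wired together through each component's `inputs` list.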

Installs everywhere

Packaged as a single binary. No dependencies, no runtime, and memory safe.

Single binary
x86_64, ARM64/ARMv7
No runtime
Memory safe

Install with a one-liner:

curl --proto '=https' --tlsv1.2 -sSf https://sh.vector.dev | bash

Or run it non-interactively, accepting the defaults:

curl --proto '=https' --tlsv1.2 -sSf https://sh.vector.dev | bash -s -- -y

Or choose your preferred method:

Highly flexible processing topologies

A wide range of sources, transforms, and sinks to choose from

Backed by a strong open source community

13k+ GitHub stars
300+ Contributors
30m+ Downloads
40 Countries