Send logs from Kafka to anywhere

A simple guide to send logs from Kafka to anywhere in just a few minutes.
type: tutorial
domain: sources
source: kafka

Logs are an essential part of observing any service; without them you'll have significant blind spots. But collecting and analyzing them can be a real challenge -- especially at scale. Not only do you need to solve the basic task of collecting your logs, but you must do it in a reliable, performant, and robust manner. Nothing is more frustrating than having your log pipeline fall on its face during an outage, or even worse, cause the outage!

Fear not! In this guide we'll build an observability pipeline that will send logs from Kafka to anywhere.

Background

What is Kafka?

Apache Kafka is an open-source project for a distributed publish-subscribe messaging system rethought as a distributed commit log. Kafka stores messages in topics that are partitioned and replicated across multiple brokers in a cluster. Producers send messages to topics from which consumers read. These features make it an excellent candidate for durably storing logs and metrics data.

Strategy

How This Guide Works

We'll be using Vector to accomplish this task. Vector is a popular open-source observability data platform. It's written in Rust, making it lightweight, ultra-fast, and highly reliable. And we'll be deploying Vector as an agent.

Vector daemon deployment strategy
1. Your service logs to STDOUT
STDOUT follows the 12 factor principles.
2. STDOUT is captured
STDOUT is captured and sent to Kafka topics.
3. Vector collects & fans-out data
Vector collects data from Kafka and fans it out to one or more destinations.
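Step 1 above is the whole contract for your application: emit one structured event per line to STDOUT and let the platform handle the rest. A minimal sketch of what such a line might look like (the service name and fields here are hypothetical):

```shell
# A 12-factor service doesn't write log files; it prints one
# JSON-encoded event per line to STDOUT, e.g.:
echo '{"timestamp":"2024-01-01T00:00:00Z","service":"checkout","level":"info","message":"request handled"}'
```

The platform captures these lines and produces them to Kafka topics, where Vector picks them up in the next step.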

What We'll Accomplish

We'll build an observability data platform that:

  • Collects logs from Kafka.
    • Enriches data with useful Kafka context.
    • Efficiently collects data and checkpoints read positions to ensure data is not lost between restarts.
    • Securely collects data via Transport Layer Security (TLS).
  • Sends your logs to one or more destinations.
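For the secure-collection point above, Vector's kafka source can speak TLS to the brokers, and you can also control where reading begins for a new consumer group. A sketch of the relevant options (the certificate paths are placeholders; verify the option names against the kafka source reference for your Vector version):

```toml
[sources.kafka]
type = "kafka"
bootstrap_servers = "10.14.22.123:9092"
group_id = "consumer-group-name"
topics = [ "topic-1" ]
# Start from the beginning of the topic when the group has no committed offset
auto_offset_reset = "earliest"
# Encrypt traffic between Vector and the brokers
tls.enabled = true
tls.ca_file = "/etc/ssl/certs/ca.crt"
tls.crt_file = "/etc/ssl/certs/vector.crt"
tls.key_file = "/etc/ssl/private/vector.key"
```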

All in just a few minutes!

Tutorial

  1. Install Vector

    curl --proto '=https' --tlsv1.2 -sSf https://sh.vector.dev | sh
  2. Configure Vector

    cat <<-'VECTORCFG' > ./vector.toml
    [sources.kafka]
    type = "kafka"
    bootstrap_servers = "10.14.22.123:9092,10.14.23.232:9092"
    group_id = "consumer-group-name"
    topics = [ "^(prefix1|prefix2)-.+", "topic-1", "topic-2" ]
    # --> Add transforms here to parse, enrich, and process data
    # print all events, replace this with your desired sink(s)
    # https://vector.dev/docs/reference/sinks/
    [sinks.out]
    type = "console"
    inputs = [ "kafka" ]
    encoding.codec = "json"
    VECTORCFG
  3. Start Vector

    vector --config ./vector.toml
  4. Observe Vector

    vector top
    vector top displays a live dashboard of your running topology, showing event throughput and errors for each source, transform, and sink.
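The placeholder comment in the step 2 config marks where transforms go. As one illustration, a remap transform could parse each Kafka message as JSON before it reaches the sink -- a sketch only, with field names you'd adjust to your payloads:

```toml
# Parse the raw Kafka message as JSON. Events that fail to parse
# pass through unchanged rather than being dropped.
[transforms.parse_json]
type = "remap"
inputs = [ "kafka" ]
source = '''
parsed, err = parse_json(.message)
if err == null {
  . = merge!(., object!(parsed))
}
'''

[sinks.out]
type = "console"
inputs = [ "parse_json" ]  # point the sink at the transform instead of the source
encoding.codec = "json"
```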

Next Steps

Vector is a powerful tool and we're just scratching the surface in this guide. Here are a few pages we recommend that demonstrate the power and flexibility of Vector:

Vector GitHub repo
Vector is free and open-source!
Vector getting started series
Get set up in just a few minutes
Vector documentation
Everything you need to know about Vector