Send logs from Docker to AWS S3

A simple guide to send logs from Docker to AWS S3 in just a few minutes.

Logs are an essential part of observing any service; without them you are flying blind. But collecting and analyzing them can be a real challenge -- especially at scale. Not only do you need to solve the basic task of collecting your logs, but you must do it in a reliable, performant, and robust manner. Nothing is more frustrating than having your log pipeline fall on its face during an outage, or even worse, disrupt more important services!

Fear not! In this guide we'll show you how to send logs from Docker to AWS S3 and build a log pipeline that will be the backbone of your observability strategy.

Background

What is Docker?

Docker is an open platform for developing, shipping, and running applications and services. Docker enables you to separate your services from your infrastructure so you can ship quickly. With Docker, you can manage your infrastructure in the same ways you manage your services. By taking advantage of Docker’s methodologies for shipping, testing, and deploying code quickly, you can significantly reduce the delay between writing code and running it in production.

What is AWS S3?

Amazon Simple Storage Service (Amazon S3) is a scalable, high-speed, web-based cloud storage service designed for online backup and archiving of data and applications on Amazon Web Services. It is very commonly used to store log data.

Strategy

How This Guide Works

We'll be using Vector to accomplish this task. Vector is a popular open-source utility for building observability pipelines. It's written in Rust, making it lightweight, ultra-fast and highly reliable. And we'll be deploying Vector as a daemon.

The daemon deployment strategy is designed for data collection on a single host. Vector runs in the background, in its own process, collecting all data for that host. For this guide, Vector will collect data from Docker via Vector's docker source. The following diagram demonstrates how it works, and a minimal source configuration sketch follows the diagram steps.

Vector daemon deployment strategy
1. Your service logs to STDOUT
Writing logs to STDOUT follows the 12-factor principles.
2. STDOUT is captured
STDOUT is captured and sent to Docker.
3. Vector collects & fans out data
Vector will send logs to AWS S3.
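
For example, the docker source can be limited to specific containers. Here's a minimal sketch (the "my-app" container name is purely illustrative, and option names can differ across Vector versions, so check the docker source reference for yours):

    [sources.in]
    type = "docker"
    # optional: only collect logs from containers whose names or IDs match
    include_containers = ["my-app"]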

What We'll Accomplish

To be clear, here's everything we'll accomplish in this short guide:

  • Collect Docker container logs.
    • Filter which containers you collect logs from.
    • Automatically merge logs that Docker splits.
    • Enrich your logs with useful Docker context.
  • Send logs to AWS S3 (see the configuration sketch after this list).
    • Dynamically partition logs across different key prefixes.
    • Compress and batch data to reduce storage cost and improve throughput.
    • Optionally adjust ACL and encryption settings.
    • Automatically retry failed requests, with backoff.
    • Buffer your data in-memory or on-disk for performance and durability.
  • All in just a few minutes!
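
To make those sink behaviors concrete, here is a sketch of the relevant aws_s3 options in vector.toml. The values are illustrative rather than recommendations, and option names can vary between Vector versions, so confirm them against the aws_s3 sink reference:

    [sinks.out]
    type = "aws_s3"
    inputs = ["in"]
    bucket = "my-bucket"
    region = "us-east-1"
    # partition objects by date using strftime templating in the key prefix
    key_prefix = "date=%F/"
    # gzip objects to reduce storage cost and improve throughput
    compression = "gzip"

    # buffer events on disk so data survives process restarts
    # (disk buffers also rely on Vector's global data_dir option)
    [sinks.out.buffer]
    type = "disk"
    max_size = 104900000 # bytes
    when_full = "block"

Failed requests are retried with backoff out of the box; the disk buffer is the opt-in piece that trades a little latency for durability.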

Tutorial

  1. Configure Vector

    cat <<-VECTORCFG > $PWD/vector.toml
    [sources.in]
    # collect logs from the Docker daemon
    type = "docker" # required

    [sinks.out]
    # write collected logs to AWS S3
    type = "aws_s3" # required
    inputs = ["in"] # required
    bucket = "my-bucket" # required
    region = "us-east-1" # required unless "endpoint" is set
    VECTORCFG
  2. Start the Vector container

    docker run \
      -v $PWD/vector.toml:/etc/vector/vector.toml:ro \
      -v /var/run/docker.sock:/var/run/docker.sock \
      timberio/vector:latest-alpine

    That's it! Simple and to the point. Hit ctrl+c to exit.
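
    One practical note: the aws_s3 sink needs AWS credentials. If the host doesn't supply them already (for example via an IAM instance profile), you can pass the standard AWS environment variables into the container -- a sketch, with placeholder values for you to substitute:

    docker run \
      -e AWS_ACCESS_KEY_ID=<your-access-key-id> \
      -e AWS_SECRET_ACCESS_KEY=<your-secret-access-key> \
      -v $PWD/vector.toml:/etc/vector/vector.toml:ro \
      -v /var/run/docker.sock:/var/run/docker.sock \
      timberio/vector:latest-alpine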

Next Steps

Vector is a powerful utility and we're just scratching the surface in this guide. Here are a few pages we recommend that demonstrate the power and flexibility of Vector:

Vector GitHub repo
Vector is free and open-source!
Vector getting started series
Go from zero to production in under 10 minutes!
Vector documentation
Thoughtful, detailed docs that respect your time.