Promtail Pipeline for Caddy Logs

This post shows how to ingest Caddy logs with Promtail and ship them to Loki so you can query and visualize them in Grafana. We’ll configure a scrape job and a set of pipeline stages to parse and enrich logs.

1) Caddy log format

Caddy can log in different formats. Two common choices:

  • logfmt (key=value pairs)
  • JSON

Example Caddy config (JSON output):

{
  "logging": {
    "logs": {
      "default": {
        "level": "INFO",
        "encoder": {
          "format": "json",
          "time_format": "rfc3339"
        },
        "writer": {
          "output": "file",
          "filename": "/var/log/caddy/access.log"
        }
      }
    }
  }
}

The time_format setting makes Caddy write RFC3339 timestamps instead of its default Unix-epoch floats, which keeps the timestamp stage in the Promtail pipeline below straightforward. If you prefer logfmt, configure the encoder accordingly; note that current Caddy releases no longer ship a built-in logfmt encoder, so this typically requires a third-party encoder module.
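
Note that the JSON snippet above only configures how and where the default log is written; per-request access logs also need to be enabled on the HTTP server. If you manage Caddy with a Caddyfile, the log directive does both at once. A minimal sketch (the example.com site block is a placeholder, and depending on your Caddy version encoder options such as time_format may still need the JSON config):

example.com {
    log {
        output file /var/log/caddy/access.log
        format json
    }
}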

2) Promtail scrape config

In promtail.yaml, define a scrape job for the Caddy log path and set some base labels:

server:
  http_listen_port: 9080
  grpc_listen_port: 0

positions:
  filename: /tmp/positions.yaml

clients:
  - url: http://loki:3100/loki/api/v1/push

scrape_configs:
  - job_name: caddy
    static_configs:
      - targets: [localhost]
        labels:
          job: caddy
          __path__: /var/log/caddy/access.log
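
If you run Promtail in Docker rather than as a local binary, mount both the config file and the Caddy log directory into the container. A sketch (image tag and container paths are assumptions, not requirements):

docker run -d --name promtail \
  -v $(pwd)/promtail.yaml:/etc/promtail/promtail.yaml:ro \
  -v /var/log/caddy:/var/log/caddy:ro \
  grafana/promtail:latest \
  -config.file=/etc/promtail/promtail.yaml

With the positions file at /tmp/positions.yaml as above, the read position lives inside the container and resets when the container is recreated; mount it to a volume if you want it to persist.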

3) Pipeline stages (JSON)

If Caddy writes JSON logs, parse them and promote useful fields to labels. These stages belong under the caddy job in scrape_configs, alongside static_configs. Caddy nests the request details (method, URI, client IP, headers) under a request object, so the expressions below use dotted JMESPath paths to reach them:

pipeline_stages:
  - json:
      expressions:
        ts: ts
        level: level
        logger: logger
        msg: msg
        status: status
        method: request.method
        uri: request.uri
        ip: request.remote_ip
        user_agent: 'request.headers."User-Agent"[0]'
        duration: duration
  - labels:
      level:
      status:
      method:
      logger:
  - timestamp:
      source: ts
      format: RFC3339
  - output:
      source: msg

Notes:

  • json extracts fields from the JSON line; the values are JMESPath expressions, which is how nested fields such as request.method are reached (see the sample line after these notes).
  • labels turns selected fields into labels (good for queries like {job="caddy",status="404"}).
  • timestamp uses the original log time rather than the scrape time. The RFC3339 format matches the time_format set in the Caddy encoder earlier; with Caddy's default Unix-epoch float ts you would need a different format, or you can omit this stage and let Promtail use the read time.
  • output sets msg as the stored log line; drop this stage if you want the full JSON line (and line filters such as |= "POST", used below) to keep working.
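
For reference, here is a trimmed, illustrative Caddy JSON access-log line with the rfc3339 time_format configured above (field names as in recent Caddy 2.x releases; exact fields vary by version and configuration, and by default ts is a Unix-epoch float):

{"level":"info","ts":"2024-05-01T12:00:00Z","logger":"http.log.access","msg":"handled request","request":{"remote_ip":"203.0.113.7","method":"GET","host":"example.com","uri":"/","headers":{"User-Agent":["curl/8.5.0"]}},"duration":0.0012,"size":1024,"status":200}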

4) Pipeline stages (logfmt)

If you use logfmt in Caddy, the parsing stage changes:

pipeline_stages:
  - logfmt:
      mapping:
        ts: time
        level: level
        method: method
        status: status
        uri: request_uri
        ip: remote_ip
        ua: user_agent
        duration: duration
  - labels:
      level:
      status:
      method:
  - timestamp:
      source: ts
      format: RFC3339

Adjust the keys to match your actual Caddy fields.

5) Validate and run

  • Start Loki and Grafana
  • Start Promtail with the config (see the dry-run sketch after this list to test the pipeline first)
  • Generate some traffic through Caddy
  • In Grafana’s Explore, pick Loki and try queries like:
    • {job="caddy"}
    • {job="caddy",status="404"}
    • {job="caddy"} |= "POST"
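
Before pointing Promtail at live traffic, you can also replay a few log lines through the pipeline without shipping anything. A sketch based on Promtail's stdin/dry-run troubleshooting mode (check the flags against your Promtail version):

cat /var/log/caddy/access.log | promtail --stdin --dry-run --inspect --config.file=promtail.yaml

The --inspect flag prints the extracted values and labels after each stage, which makes it easy to spot a JMESPath expression that doesn't match your log format.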

6) Tips

  • Keep label cardinality in check; only promote low‑cardinality fields (status, method).
  • Use drop or match stages to reduce noise (see the sketch after this list).
  • Consider multi‑file targets if you rotate logs (e.g. /var/log/caddy/*.log).
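
For example, a drop stage appended to the JSON pipeline above could discard health-check requests before they are shipped. The /healthz path is an assumption; the stage relies on uri being extracted by the json stage:

pipeline_stages:
  # ... json, labels, timestamp stages from above ...
  - drop:
      source: uri
      expression: "^/healthz$"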

With this setup, Caddy logs are parsed, labeled, and shipped to Loki for powerful analysis and dashboards in Grafana.