
How can I send and aggregate multiple docker container's logs to journald?

Asked: 2019-09-24T20:17:44 by Chris Stryczynski


I'm running multiple containers that run Apache. I'd like this specific set of containers to log their output to a single location - either a file, or possibly journald.

Just some way in which I can aggregate their logs so they can be viewed together.


I'm not looking for a heavy solution like fluentd / ELK stack.

How can I achieve the above? Currently all the containers log to /dev/stdout, so their output is collected by 'docker logs' - but that is per container and doesn't seem possible to aggregate.

According to "Save docker-compose logs to a file" it seems I might be able to set a 'log path' - but how? Via a logging driver? And can that log file be shared between multiple containers?
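
For what it's worth, the closest I've got without touching logging drivers is redirecting the combined docker-compose output into a file - a rough sketch, where combined.log is just an arbitrary file name I picked:

# One-off snapshot of stdout/stderr from every service in the compose project, with timestamps
docker-compose logs --no-color -t > combined.log

# Or follow all services together in a single terminal
docker-compose logs -f

But that feels more like a snapshot than a persistent aggregation setup, which is why I'm wondering about a driver-level approach.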

Is the systemd logging driver a suitable option?
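
If it is, I assume it could also be enabled globally instead of per container, by putting this in /etc/docker/daemon.json (a sketch - I believe only containers created after the restart pick up the new default):

{
  "log-driver": "journald"
}

followed by restarting the daemon:

sudo systemctl restart docker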


So I've had some luck with the journald logging driver. I've set some labels on a container like so:

version: "3"
services:
  nginx-lb:
    labels:
      - "node_service=nginx"
    logging:
      driver: "journald"
      options:
        labels: "node_service=nginx"
    restart: always
    network_mode: host
    build: .
    ports:
      - "80:80"
      - "443:443"

But now, how do I filter by these labels when viewing them with journalctl?

Here is an example journald entry generated:

{ "__CURSOR" : "s=b300aa41db4946f1bcc528e2522627ce;i=1087c;b=e6decf90a91f40c2ad7507e342fda85a;m=8744b1cdfa;t=5934bdb103a24;x=9ba66ecb768eb67", "__REALTIME_TIMESTAMP" : "1569328890657316", "__MONOTONIC_TIMESTAMP" : "580973088250", "_BOOT_ID" : "e6decf90a91f40c2ad7507e342fda85a", "_MACHINE_ID" : "c1339882251041f48f4612e758675ff3", "_HOSTNAME" : "staging", "PRIORITY" : "6", "_UID" : "0", "_GID" : "0", "_CAP_EFFECTIVE" : "3fffffffff", "_SELINUX_CONTEXT" : "unconfined\n", "_SYSTEMD_SLICE" : "system.slice", "_TRANSPORT" : "journal", "_PID" : "3969", "_COMM" : "dockerd", "_EXE" : "/usr/bin/dockerd", "_CMDLINE" : "/usr/bin/dockerd -H fd:// --containerd=/run/containerd/containerd.sock", "_SYSTEMD_CGROUP" : "/system.slice/docker.service", "_SYSTEMD_UNIT" : "docker.service", "_SYSTEMD_INVOCATION_ID" : "9f1488b462ae478a84bec6e64d72886b", "CONTAINER_NAME" : "3b9b51b4cda1a1e3b21a01f6fe80c7748fb3d231_apache_1", "CONTAINER_TAG" : "497b2f965b76", "SYSLOG_IDENTIFIER" : "497b2f965b76", "CONTAINER_ID" : "497b2f965b76", "CONTAINER_ID_FULL" : "497b2f965b767f897786f3bb8c4789dd91db1a91fe34e5ede368172f44fb3aac", "MESSAGE" : "192.168.240.1 - - [24/Sep/2019:12:41:30 +0000] \"GET / HTTP/1.0\" 200 2697 \"-\" \"curl/7.58.0\"", "_SOURCE_REALTIME_TIMESTAMP" : "1569328890657297" }

Author: Chris Stryczynski. Reproduced under the CC BY-SA 4.0 license with a link to the original source and this disclaimer.
Link to original article: https://stackoverflow.com/questions/58080170/how-can-i-send-and-aggregate-multiple-docker-containers-logs-to-journald
Chris Stryczynski (answered 2019-09-24T13:30:34):

I instead used the tag logging option.

version: "3"
services:
  nginx-lb:
    labels:
      - "node_service=nginx"
    logging:
      driver: "journald"
      options:
        labels: "node_service=nginx"
        tag: "nginx"
    restart: always
    network_mode: host
    build: .
    ports:
      - "80:80"
      - "443:443"

And then to view / filter:

journalctl CONTAINER_TAG=nginx --since "1 hour ago"