Hi All, this document shows you how to push Docker container logs to the ELK stack.
ELK stands for Elasticsearch, Logstash, and Kibana. It is a log analysis platform where users can search, analyze, and visualize logs.
When a Docker container runs, logs are generated based on the application running inside it. These logs can be viewed using docker logs <container-ID>. If the containers are deployed as a Docker service, you can view their logs with docker service logs <service-name>.
docker logs <container-ID> shows the STDOUT and STDERR output that would normally appear in the terminal.
When a container dies, the logs it generated disappear with it. In some cases you might need to retain container logs for future analysis, and in that case it is better to store the logs in ELK.
If the application running inside the container already supports pushing logs to ELK, you don't need this approach. For example, a Spring Boot application can use logback.xml to push logs to Logstash, which then forwards them to Elasticsearch.
So here I will show how to push the logs to ELK at the Docker level,
where 184.108.40.206 is the Logstash IP and 5000 is the Logstash input port.
The logging driver is syslog instead of json-file, the default one.
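The run command described above might look like the following. This is only a sketch: the image name myapp:latest is a placeholder, while the address and port are the values mentioned in the text.

```shell
# Run a container whose logs are shipped to Logstash via the syslog driver.
# "myapp:latest" is a placeholder image name.
docker run -d \
  --log-driver syslog \
  --log-opt syslog-address=tcp://184.108.40.206:5000 \
  myapp:latest
```

With the syslog driver enabled, note that docker logs will no longer show output locally, since the log stream is sent to the remote endpoint instead.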
Below is the logstash.conf file that accepts the connection from Docker and pushes the logs to Elasticsearch,
where 220.127.116.11 is the Elasticsearch IP address. You can add filters if you wish to modify the incoming data from Docker.
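A minimal logstash.conf along the lines described above might look like this. The index name and the Elasticsearch port 9200 are assumptions; the listen port 5000 and the IP are the values from the text.

```
# Accept syslog messages from the Docker logging driver on port 5000
input {
  syslog {
    port => 5000
  }
}

# Forward the events to Elasticsearch.
# The index name "docker-logs-..." is a placeholder; adjust as needed.
output {
  elasticsearch {
    hosts => ["220.127.116.11:9200"]
    index => "docker-logs-%{+YYYY.MM.dd}"
  }
}
```

A filter { } section can be added between input and output to parse or enrich the incoming events.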
Make sure the Logstash service is up before starting the Docker containers.
Below is an example of a Docker stack file that pushes logs to ELK. It works for any number of container replicas.
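A stack file of the kind described could be sketched as follows. The service name and image are placeholders; the logging section applies the same syslog driver settings to every replica.

```yaml
version: "3.7"
services:
  myapp:                        # placeholder service name
    image: myapp:latest         # placeholder image
    deploy:
      replicas: 3               # any number of replicas works the same way
    logging:
      driver: syslog
      options:
        syslog-address: "tcp://184.108.40.206:5000"
```

Deploy it with docker stack deploy -c <stack-file> <stack-name>, and each replica's output will be shipped to Logstash.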
© 2020, Techrunnr. All rights reserved.