Use Logstash to send Rancher container logs to ELK

Behrouz hasanbeygi
2 min read · Aug 5, 2018


First of all, you need an ELK stack in your Rancher cluster. I prefer my own catalog (docker-compose and rancher-compose) over the official or community ones, because you can customize it for yourself; if you can read Farsi, I wrote about it here before.

In fact, Docker logging is not specific to Rancher or any orchestration tool; you can use this tutorial for Swarm, Kubernetes, and so on.

Flow of log centralization

Docker uses log drivers to save or forward JSON-formatted logs. In this tutorial, I use the GELF log driver on the Docker side and the GELF input in Logstash to collect the logs and send them to Elasticsearch.
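For example, independent of any orchestrator, a single container can ship its stdout/stderr over GELF like this (the address 127.0.0.1:12201 and the tag are placeholders for wherever your Logstash GELF input listens):

```
docker run \
  --log-driver gelf \
  --log-opt gelf-address=udp://127.0.0.1:12201 \
  --log-opt tag="my-nginx" \
  nginx
```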

So, assuming you have a proper Elasticsearch cluster, you need a Logstash container for each stack, depending on your needs. You can run multiple pipelines in a single Logstash instance, but it is better to allocate one Logstash and one pipeline per stack.
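If you do go with multiple pipelines in one instance, Logstash reads them from pipelines.yml; a minimal sketch (the pipeline ids and config paths are just examples):

```yaml
# pipelines.yml -- one entry per pipeline
- pipeline.id: nginx-stack
  path.config: "/usr/share/logstash/pipeline/nginx.conf"
- pipeline.id: app-stack
  path.config: "/usr/share/logstash/pipeline/app.conf"
```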

For configuring the Logstash container you have two ways.

The first is to mount the pipelines into the Logstash container with a shared or local storage driver, like this:
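A minimal docker-compose sketch of that approach (the image tag, volume name, and port are assumptions; adjust them to your stack):

```yaml
version: '2'
services:
  logstash:
    image: docker.elastic.co/logstash/logstash:6.3.2
    volumes:
      # pipeline configs live on a shared or local volume
      - logstash-pipeline:/usr/share/logstash/pipeline
    ports:
      - "12201:12201/udp"   # GELF input
volumes:
  logstash-pipeline:
    driver: local
```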

Or you can build your own Logstash image with the pipeline baked in and push it to your registry, like this:
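A Dockerfile sketch for the second approach (the base image tag and file name are assumptions):

```dockerfile
FROM docker.elastic.co/logstash/logstash:6.3.2
# replace the default pipeline with our own
RUN rm -f /usr/share/logstash/pipeline/logstash.conf
COPY logstash.conf /usr/share/logstash/pipeline/logstash.conf
```

Then build and push it, for example with `docker build -t registry.example.com/mystack/logstash:1.0 .` followed by `docker push registry.example.com/mystack/logstash:1.0` (the registry name is, of course, a placeholder).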

For the logstash.conf in your pipeline, you can use something like this for the nginx access and error logs, parsing them with grok and geoip:
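A sketch of such a pipeline, assuming the default nginx combined access log format and an Elasticsearch master reachable as `elasticsearch-master` (both are assumptions):

```
input {
  gelf {
    port => 12201
  }
}

filter {
  grok {
    # the default nginx access log format matches the combined Apache pattern
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch-master:9200"]
    index => "nginx-%{+YYYY.MM.dd}"
  }
}
```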

In the last step, you need to link your Logstash to the master Elasticsearch node, and in the Security/Host configuration of your service in Rancher Cattle set the logging parameters below, just changing the address to the local IP of your Logstash. In addition, you can add basic auth and an extra tag option to your logs.
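In docker-compose terms, those host settings amount to something like this on the service whose logs you want to ship (the IP 10.42.0.10 is only an example for your Logstash container's address, and the tag is arbitrary):

```yaml
version: '2'
services:
  nginx:
    image: nginx
    logging:
      driver: gelf
      options:
        gelf-address: "udp://10.42.0.10:12201"
        tag: "nginx-frontend"   # extra tag to identify the source
```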

Now, in Kibana, you can define an index pattern and visualize your data.

