Written by Balázs SOLTÉSZ

Version 0.1

This document describes the eduteams logging infrastructure and provides a guide for adding clients to the system.

The system collects and parses log messages in a search-friendly way, utilizing the power of Logstash, Elasticsearch and Kibana. Logs are collected from hosts by Filebeat, which requires all logs to be written to file.

Filebeat sends the collected data to Logstash, which can perform various transformations and data extractions, then sends the data to the Elasticsearch cluster to be stored. Kibana can be used to query Elasticsearch in a performant and easy manner.

Note: this document is a work in progress.

Adding Clients

Adding clients is straightforward with our Ansible library. The first step is to specify the Filebeat prospectors to configure on the host. A prospector is a source of log messages, defined by a file path. An asterisk (‘*’) may be used in the path to match multiple log files with a single prospector.

As good practice, these prospectors should be set in a host-specific way, in the host_vars/hostname/vars file.

For an example, check out registry.test.eduteams.org’s config:

---
# …
filebeat_prospectors:
  - input_type: log
    paths:
      - "/var/log/*.log"
      - "/var/log/apache2/registry.test.eduteams.org_*.log"
filebeat_output_logstash_hosts:
  - "logserver.test.eduteams.org:5044"

First we define a single prospector with two paths, both matching multiple plain-text log files.

The second option overrides the default value (logserver.eduteams.org) of the filebeat_output_logstash_hosts variable.

Now that Filebeat is configured for the host, the host must be added to the ‘log-client’ group in the inventory. Don’t worry if it is already in another group; Ansible supports hosts belonging to multiple groups.
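For illustration, a host already in an application group can simply be listed again under log-client. A minimal sketch of the inventory, assuming an INI-format inventory and a hypothetical ‘registry’ group name:

# inventory — group names other than log-client are assumptions for this example
[registry]
registry.test.eduteams.org

[log-client]
registry.test.eduteams.org

Listing the same hostname under both groups is enough; no aliasing is needed.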

Then run the logging.yml playbook with the log-client tag.
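Assuming a standard ansible-playbook setup, the invocation would look something like this (the inventory path is an assumption; adjust it to your checkout):

ansible-playbook -i inventory logging.yml --tags log-client

The --tags flag restricts the run to tasks tagged log-client, so other parts of the playbook are skipped.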

In a perfect world, you’d then proceed to write custom Logstash filters to correctly parse your logs, but the infrastructure isn’t that advanced yet.
