syslog-ng Open Source Edition 3.30 - Administration Guide


The Python HTTP header plugin

This section describes the syslog-ng Open Source Edition (syslog-ng OSE) application's Python HTTP header plugin.

For more information about modules in syslog-ng OSE, see Modules in syslog-ng Open Source Edition (syslog-ng OSE).

The syslog-ng OSE application supports adding custom headers to HTTP requests using the Python programming language.

Prerequisites

NOTE: Before you use the python-http-header plugin, make sure that your syslog-ng OSE installation was compiled with Python support. If you installed syslog-ng OSE from a package, make sure that the subpackage containing Python support is also installed.

Configuration
destination d_http {
  http(
    python_http_header(
      class("<class-name>")
      options("options-key1", "option-value1")
      options("options-key2", "option-value2")
      mark-errors-as-critical(no))
    url("http://127.0.0.1:8888")
  );
};

Options used in the configuration

  • class: Mandatory option. It refers to the user's Python class that implements the python-http-header interface. It can be mymodule.MyClass if the class MyClass is put into a mymodule.py module, or simply MyClass if the user's code is provided inline in the configuration, using the python { ... }; keyword.

    NOTE: If you put the class implementation into its own module, place the module in a standard location, or make it available through the PYTHONPATH environment variable (see the module sketch after this list).

  • options("key", "value"): Optional option. You can specify multiple options at the same time. The syslog-ng OSE application builds a Python dictionary from these key-value pairs, which is available in the __init__() method.
  • mark-errors-as-critical(yes|no): Optional option. Its default value is yes. If a Python error occurs, this option decides whether the HTTP destination still tries to send the request without the custom header items, or disconnects instead.
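For example, if you put the class into its own module, a minimal mymodule.py could look like the following sketch. The X-Example header name is illustrative only; the option key options-key1 matches the configuration example above.

# mymodule.py -- minimal sketch of a python-http-header class.
# Place this file in a standard location or in a directory listed in
# PYTHONPATH, then reference it with class("mymodule.MyClass").

class MyClass:
    def __init__(self, options):
        # The options("key", "value") pairs from the configuration
        # arrive here as a Python dictionary.
        self.value = options.get("options-key1", "")

    def get_headers(self, body, headers):
        # Return the headers to add as a list of "Name: value" strings.
        # X-Example is an illustrative header name.
        return ["X-Example: {}".format(self.value)]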
Defining the python-http-header() interface

You can define the Python interface with the following:

class TestCounter():
  def __init__(self, options):
    self.key = options["value"]

  def get_headers(self, body, headers):
    return ["header1: value1", "header2: value2"]

  def on_http_response_received(self, http_code):
    print("HTTP response code received: {}".format(http_code))

By default, when the HTTP module emits the signal_http_header_request signal, the connected slot automatically executes the Python code.

NOTE: If the plugin fails, by default the HTTP module does not send the HTTP request without the custom header items; instead, it disconnects. If you want the HTTP module to try sending the request without the header items, set mark-errors-as-critical(no).

Methods used in the configuration

  • __init__(self, options): Optional method. The options specified in the syslog-ng OSE configuration can be stored in the instance using this method.
  • get_headers(self, body, headers): Mandatory method. Returns a list of strings in the form ["header: value", ...]. The returned headers are set on the outgoing HTTP request. The body argument contains the body of the HTTP request; the headers argument contains the headers that the HTTP destination has already added to the request.
  • on_http_response_received(self, http_code): Optional method. If implemented, syslog-ng OSE passes the status code of the HTTP response to this method. You can use it to handle errors (for example, to recreate authentication headers or to drop a cache), as shown in the sketch after this list.
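For example, the following sketch uses on_http_response_received() to refresh an authorization header after the server rejects a request. The refresh_token() helper is hypothetical and only illustrates the idea:

class AuthHeader:
    def __init__(self, options):
        self.token = options["token"]          # initial token from options()
        self.stale = False

    def get_headers(self, body, headers):
        if self.stale:
            self.token = self.refresh_token()
            self.stale = False
        return ["Authorization: Bearer {}".format(self.token)]

    def on_http_response_received(self, http_code):
        # Mark the cached token as stale if the server rejected it.
        if http_code in (401, 403):
            self.stale = True

    def refresh_token(self):
        # Hypothetical helper: obtain a new token from your identity provider.
        return self.token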
Example configuration for using the Python HTTP header plugin

The following example can be copy-pasted and used as a template for using the Python HTTP header plugin in your configuration.

python {
from syslogng import Logger

logger = Logger()

class TestCounter():
    def __init__(self, options):
        self.header = options["header"]
        self.counter = int(options["counter"])
        logger.debug(f"TestCounter class instantiated; options={options}")

    def get_headers(self, body, headers):
        logger.debug(f"get_headers() called, received body={body}, headers={headers}")

        response = ["{}: {}".format(self.header, self.counter)]
        self.counter += 1
        return response

    def on_http_response_received(self, http_code):
        self.counter += http_code
        logger.debug("HTTP response code received: {}".format(http_code))

    def __del__(self):
        logger.debug("Deleting TestCounter class instance")
};

source s_network {
  network(port(5555));
};

destination d_http {
    http(
        python_http_header(
            class("TestCounter")
            options("header", "X-Test-Python-Counter")
            options("counter", 11)
            # this means that syslog-ng will keep trying to send the http request even when this module fails
            mark-errors-as-critical(no)
        )
        url("http://127.0.0.1:8888")
    );
};

log {
    source(s_network);
    destination(d_http);
    flags(flow-control);
};

Caution:

Although it is possible to configure multiple HTTP workers for syslog-ng OSE, the syslog-ng OSE application can only embed a single Python interpreter at a time. As a result, if you configure more than one HTTP worker, the Python code runs concurrently. To protect the state of the object, you may need to use locks, as sketched below.

For more information about using locks, see Introduction to the Python HTTP header.
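For example, a thread-safe variant of the TestCounter class above could protect its counter with a lock (a minimal sketch, assuming the standard threading module is available in the embedded interpreter):

import threading

class TestCounter:
    def __init__(self, options):
        self.header = options["header"]
        self.counter = int(options["counter"])
        self.lock = threading.Lock()   # protects self.counter across workers

    def get_headers(self, body, headers):
        with self.lock:
            value = self.counter
            self.counter += 1
        return ["{}: {}".format(self.header, value)]

    def on_http_response_received(self, http_code):
        with self.lock:
            self.counter += http_code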



kafka: Publishing messages to Apache Kafka (Java implementation)

Starting with version 3.7, syslog-ng OSE can directly publish log messages to the Apache Kafka message bus, where subscribers can access them.

  • This destination is only supported on the Linux platform.

  • Since syslog-ng OSE uses the official Java Kafka producer, the kafka destination has significant memory usage.

  • The log messages of the underlying client libraries are available in the internal() source of syslog-ng OSE.

Declaration:
@include "scl.conf"

kafka(
    client-lib-dir("/opt/syslog-ng/lib/syslog-ng/java-modules/:<path-to-preinstalled-kafka-libraries>")
    kafka-bootstrap-servers("1.2.3.4:9092,192.168.0.2:9092")
    topic("${HOST}")
);
Example: Sending log data to Apache Kafka

The following example defines a kafka destination, using only the required parameters.

@include "scl.conf"

destination d_kafka {
  kafka(
    client-lib-dir("/opt/syslog-ng/lib/syslog-ng/java-modules/KafkaDestination.jar:/usr/share/kafka/lib/")
    kafka-bootstrap-servers("1.2.3.4:9092,192.168.0.2:9092")
    topic("${HOST}")
  );
};

The kafka() driver is actually a reusable configuration snippet configured to publish log messages to Apache Kafka using the Java language-binding of syslog-ng OSE. For details on using or writing such configuration snippets, see Reusing configuration blocks. You can find the source of the kafka configuration snippet on GitHub. For details on extending syslog-ng OSE in Java, see the Getting started with syslog-ng development guide.

NOTE: If you delete all Java destinations from your configuration and reload syslog-ng, the JVM is no longer used, but it keeps running. If you want to stop the JVM, stop syslog-ng, and then start syslog-ng again.



Prerequisites

To publish messages from syslog-ng OSE to Apache Kafka, complete the following steps.

Steps:
  1. If you want to use the Java-based modules of syslog-ng OSE (for example, the Elasticsearch, HDFS, or Kafka destinations), you must compile syslog-ng OSE with Java support.

    • Download and install the Java Runtime Environment (JRE), 1.7 (or newer). You can use OpenJDK or Oracle JDK; other implementations are not tested.

    • Install gradle version 2.2.1 or newer.

    • Set LD_LIBRARY_PATH to include the libjvm.so file, for example:

      LD_LIBRARY_PATH=/usr/lib/jvm/java-7-openjdk-amd64/jre/lib/amd64/server:$LD_LIBRARY_PATH

      Note that many platforms provide simplified links for the Java libraries. Use the simplified path if available. If you use a startup script to start syslog-ng OSE, set LD_LIBRARY_PATH in the script as well.

    • If you are behind an HTTP proxy, create a gradle.properties file under the modules/java-modules/ directory, and set the proxy parameters in that file. For details, see The Gradle User Guide.

  2. Download the latest stable binary release of the Apache Kafka libraries (version 0.9 or newer) from http://kafka.apache.org/downloads.html.

  3. Extract the Apache Kafka libraries into a single directory. If needed, collect the various .jar files into a single directory (for example, /opt/kafka/lib/) where syslog-ng OSE can access them. You must specify this directory in the syslog-ng OSE configuration file.

  4. Check whether the following files in the Kafka libraries have the same version number: slf4j-api-<version-number>.jar and slf4j-log4j12-<version-number>.jar. If their version numbers differ, complete the following steps:

    1. Delete one of the files (for example, slf4j-log4j12-<version-number>.jar).

    2. Download a version that matches the version number of the other file (for example, 1.7.6) from the official SLF4J distribution.

    3. Copy the downloaded file into the directory of your Kafka library files (for example, /opt/kafka/lib/).



How syslog-ng OSE interacts with Apache Kafka

When you stop the syslog-ng OSE application, syslog-ng OSE does not stop until all Java threads are finished, including the threads started by the Kafka Producer. There is no way (except for the kill -9 command) to stop syslog-ng OSE before the Kafka Producer stops. To change this behavior, set the properties of the Kafka Producer in its properties file, and reference that file in the properties-file() option.

The syslog-ng OSE kafka destination tries to reconnect to the brokers in a tight loop. This can look like spinning, because it produces a lot of similar debug messages. To decrease the number of such messages, set a larger timeout using the following properties:

retry.backoff.ms=1000
reconnect.backoff.ms=1000

For details on using property files, see properties-file(). For details on the properties that you can set in the property file, see the Apache Kafka documentation.
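For example, you can save the settings above into a properties file and reference it from the destination with the properties-file() option. This is a sketch only: the file path is illustrative, and the other options repeat the earlier example.

destination d_kafka {
  kafka(
    client-lib-dir("/opt/syslog-ng/lib/syslog-ng/java-modules/KafkaDestination.jar:/usr/share/kafka/lib/")
    kafka-bootstrap-servers("1.2.3.4:9092,192.168.0.2:9092")
    topic("${HOST}")
    properties-file("/opt/syslog-ng/etc/kafka-producer.properties")
  );
};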

