
Big data ingestion

Many big data projects run into the 80/20 rule: 80% of resources are spent getting data into analytics tools and only 20% on analyzing the data. syslog-ng can deliver data from a wide variety of sources to Hadoop, Elasticsearch, MongoDB, Kafka, and many other destinations.


Wide variety of data

Delivering data in disparate formats from systems, applications, and devices often requires multiple tools and special integration work.

Massive data volumes

Big data is, well, big. Many data sources can overwhelm data collection tools.

Difficult to access data

Most big data systems capture data from complex, distributed systems, often from multiple remote sites with a variety of connectivity and latency issues.

Incomplete data

Insights based on incomplete data are often wrong. In large environments, it’s easy to leak data during collection and ingestion.

High data ingestion costs

Getting data into data stores is often the most time-consuming and costly part of big data projects.

Varying data consumer requirements

Big data systems often serve a variety of data consumers, each with its own requirements.

Why syslog-ng?

Collect From a Wide Variety of Sources

syslog-ng can collect logs from legacy systems, web servers, and SQL databases, as well as from any application or device that generates JSON messages or text-based log files.
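As an illustration, a syslog-ng configuration can declare several source types side by side; the port, file path, and source names below are placeholders:

```
# Local system logs plus syslog-ng's own internal messages
source s_local { system(); internal(); };

# Network syslog over TCP (RFC 5424 framing) on the standard port
source s_network { syslog(transport("tcp") port(601)); };

# Follow an arbitrary text-based application log file
source s_appfile { file("/var/log/myapp.log" follow-freq(1)); };
```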

Send to Hadoop

syslog-ng natively streams data to the Hadoop Distributed File System (HDFS) as well as the MapR distributed file system. We have partnered with Hadoop pioneers Hortonworks and MapR to ensure seamless data transfer from syslog-ng to your Hadoop cluster.
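A sketch of the hdfs() destination driver, assuming the Hadoop client libraries are installed on the syslog-ng host; the library directory, namenode URI, and file path are illustrative:

```
destination d_hdfs {
  hdfs(
    client-lib-dir("/opt/hadoop/libs")   # directory holding the Hadoop client JARs
    hdfs-uri("hdfs://namenode.example.com:8020")
    hdfs-file("/var/log/syslog-ng/${HOST}-${YEAR}${MONTH}${DAY}.log")
  );
};
```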

Flexibly Route Data

Data consumers have different needs, and many organizations use a variety of data management and analysis tools. syslog-ng can flexibly route data from any number of sources to any number of destinations.
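Routing is expressed with log paths that connect sources, filters, and destinations. In this sketch, everything goes to one destination while only error-level messages go to a second; the source and destination names are illustrative:

```
filter f_errors { level(err..emerg); };

# Archive everything; forward only errors to the search cluster
log { source(s_network); destination(d_archive); };
log { source(s_network); filter(f_errors); destination(d_search); };
```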

Send to Elasticsearch

syslog-ng streams log data directly to Elasticsearch, one of the most popular enterprise search engines. syslog-ng delivers log data to Elasticsearch in JSON format, where it can be indexed and searched.
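One way this can look, assuming a recent syslog-ng release with the elasticsearch-http() destination (older releases shipped a Java-based driver instead); the URL and index pattern are placeholders:

```
destination d_elastic {
  elasticsearch-http(
    url("http://elastic.example.com:9200/_bulk")
    index("syslog-${YEAR}.${MONTH}.${DAY}")   # daily indices
    type("")                                  # empty type for Elasticsearch 7+
  );
};
```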

Transform Data in Real-Time

syslog-ng can parse, rewrite, and reformat log messages as they pass through, so data arrives at each destination in the structure and format its consumers expect.
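For example, messages can be parsed and rewritten in flight with syslog-ng's parser and rewrite blocks; the prefix, pattern, and replacement below are illustrative:

```
# Parse JSON payloads into name-value pairs under the .json. prefix
parser p_json { json-parser(prefix(".json.")); };

# Mask anything that looks like a 16-digit card number
rewrite r_mask {
  subst("[0-9]{16}", "XXXX", value("MESSAGE"), flags("global"));
};
```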

Send to MongoDB

syslog-ng sends log data to MongoDB, a popular document NoSQL database. With its schema-free, document-oriented architecture, MongoDB is used as a backend to many websites and services.
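A minimal mongodb() destination sketch; the connection URI, database, and collection names are placeholders:

```
destination d_mongodb {
  mongodb(
    uri("mongodb://mongo.example.com:27017/syslog")
    collection("messages")
    # Store standard macros and parsed name-value pairs as document fields
    value-pairs(scope("selected-macros" "nv-pairs"))
  );
};
```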

Deliver Data with End-to-End Reliability

Data transfer over TCP and the Reliable Log Transfer Protocol (RLTP), local disk buffering, client-side failover, and other features ensure zero message loss, giving you confidence in your data.
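Disk buffering, for instance, can be enabled per destination so messages survive network outages and restarts; the hostname and buffer sizes below are illustrative:

```
destination d_central {
  syslog("collector.example.com"
    transport("tls")
    disk-buffer(
      reliable(yes)                # persist the buffer so restarts lose nothing
      disk-buf-size(2147483648)    # 2 GiB on-disk buffer
      mem-buf-size(163840000)
    )
  );
};
```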

Send to Apache Kafka

syslog-ng sends messages directly to Apache Kafka, a messaging system designed for streaming use cases with large data volumes and low latency requirements. syslog-ng publishes log messages to Kafka topics, where subscribers can consume them.
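A sketch of a kafka() destination, assuming the C-based driver available in newer syslog-ng releases; the broker addresses, topic, and message template are placeholders:

```
destination d_kafka {
  kafka(
    bootstrap-servers("kafka1.example.com:9092,kafka2.example.com:9092")
    topic("syslog")
    message("${ISODATE} ${HOST} ${MESSAGE}")   # payload template for each record
  );
};
```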


Reduced deployment and maintenance costs

syslog-ng’s architecture enables you to deploy the same software packages on more than 50 different server platforms.

Confidence in your data

Getting all of your data means you can rely on the insights derived from your analytic platform.

Data when you need it

Having data delivered in real-time means you can react and get answers faster.

Secure data

End-to-end encryption prevents unwanted third-party access.