Writing Logging Output to Kafka

A log writer that sends logging output to Kafka. This provides a convenient means for tools in the Hadoop ecosystem, such as Storm and Spark, to process the data generated by Bro.

Installation

Install librdkafka (https://github.com/edenhill/librdkafka), a native client library for Kafka. This plugin has been tested against librdkafka v0.8.6, the latest release at the time of this writing:

# curl -L https://github.com/edenhill/librdkafka/archive/0.8.6.tar.gz | tar xvz
# cd librdkafka-0.8.6/
# ./configure
# make
# sudo make install

Then compile this Bro plugin using the following commands, where $BRO_SRC is the path to your Bro source tree:

# ./configure --bro-dist=$BRO_SRC
# make
# sudo make install

Run the following command to ensure that the plugin was installed successfully:

# bro -N Bro::Kafka
Bro::Kafka - Writes logs to Kafka (dynamic, version 0.1)

Activation

The easiest way to enable Kafka output is to load the plugin’s logs-to-kafka.bro script. If you are using BroControl, any of the following examples added to local.bro will activate it.

In this example, all Conn, DNS, and HTTP logs will be sent to a Kafka broker running on the localhost. By default, the log stream’s path defines the topic name: Conn::LOG is sent to the topic named conn, DNS::LOG to dns, and HTTP::LOG to http.

@load Bro/Kafka/logs-to-kafka.bro
redef Kafka::logs_to_send = set(Conn::LOG, DNS::LOG, HTTP::LOG);
redef Kafka::kafka_conf = table(
    ["metadata.broker.list"] = "localhost:9092"
);

If all log streams need to be sent to the same topic, define the topic in the Kafka::topic_name variable. In this example, both Conn::LOG and HTTP::LOG will be sent to the topic named bro.

@load Bro/Kafka/logs-to-kafka.bro
redef Kafka::logs_to_send = set(Conn::LOG, HTTP::LOG);
redef Kafka::kafka_conf = table(
    ["metadata.broker.list"] = "localhost:9092"
);
redef Kafka::topic_name = "bro";

It is also possible to send each log stream to its own topic and to customize those topic names. This uses the same mechanism that customizes the name of a log file for a stream; see this older example (look for the $path_func field): http://blog.bro.org/2012/02/filtering-logs-with-bro.html.
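As a rough sketch, a per-stream filter can override the topic through a path function. The writer constant Log::WRITER_KAFKAWRITER, the filter name, and the topic name custom_dns below are illustrative assumptions, not verbatim from this documentation:

@load Bro/Kafka/logs-to-kafka.bro

# Hypothetical path function: route every DNS log record to the
# topic "custom_dns" instead of the default "dns".
function dns_topic_func(id: Log::ID, path: string, rec: any): string
    {
    return "custom_dns";
    }

event bro_init()
    {
    # Attach a filter to the DNS stream that uses the Kafka writer
    # together with the custom path function above.
    local f: Log::Filter = [
        $name = "kafka-dns",
        $writer = Log::WRITER_KAFKAWRITER,
        $path_func = dns_topic_func
    ];
    Log::add_filter(DNS::LOG, f);
    }

The same pattern applies to any other stream; the returned string becomes the topic name for each record the filter handles.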

Settings

kafka_conf

The global configuration settings for Kafka. These values are passed through directly to librdkafka. Any valid librdkafka settings can be defined in this table.

redef Kafka::kafka_conf = table(
    ["metadata.broker.list"] = "localhost:9092",
    ["client.id"] = "bro"
);

topic_name

The Kafka topic to which all Bro log streams are sent. If each log stream needs to go to its own topic, leave this value undefined.

redef Kafka::topic_name = "bro";

max_wait_on_shutdown

The maximum number of milliseconds that the plugin will wait for any backlog of queued messages to be sent to Kafka before forcing a shutdown.

redef Kafka::max_wait_on_shutdown = 3000;

tag_json

If true, each JSON-formatted message is tagged with its log stream identifier. For example, a Conn::LOG message will look like { "conn": { ... } }.

redef Kafka::tag_json = T;

Copyright 2016, The Bro Project. Last updated on December 07, 2018. Created using Sphinx 1.8.2.