filebeat: collecting logs and writing them to Kafka

This article covers how to collect logs with filebeat and write them to Kafka.
Installing filebeat

root@ubuntu:/data# dpkg -i filebeat-6.8.1-amd64.deb
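After installing the package, it is worth confirming the binary works and enabling the service. These are standard filebeat and systemd commands (a sketch; whether filebeat is managed by systemd depends on the host):

```shell
# Confirm the installed version
filebeat version

# Start filebeat now and enable it at boot (systemd-managed hosts)
systemctl enable filebeat
systemctl start filebeat
```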

Collecting a single system log with filebeat
1) Test writing to a local file
root@ubuntu:/data# grep -Ev '^$|#' /etc/filebeat/filebeat.yml
--------------------------------------------------------------
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/*.log

# Test: write to a local file
output.file:
  path: "/tmp"
  filename: "filebeat.log"
--------------------------------------------------------------
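With the file output configured, restarting filebeat and tailing the output file is a quick way to confirm events are flowing; filebeat's file output writes one JSON document per line (a sketch, assuming the config above is in place and nginx is producing log lines):

```shell
# Restart filebeat so it picks up the new config
systemctl restart filebeat

# Each harvested log line should appear as a JSON document
tail -f /tmp/filebeat.log
```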


2) Write to Kafka
root@ubuntu:~# grep -Ev '^#|^$' /etc/filebeat/filebeat.yml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/nginx/*.log
  document_type: "nginxlog-kafka"
  exclude_lines: ['^DBG']
  exclude_files: ['.gz$']

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 3

setup.kibana:

output.kafka:
  hosts: ["192.168.47.113:9092","192.168.47.112:9092","192.168.47.111:9092"]
  topic: "nginxlog-kafka"
  partition.round_robin:
    reachable_only: true
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~


/usr/local/kafka/bin/kafka-topics.sh \
  --list \
  --zookeeper 192.168.47.111:2181,192.168.47.112:2181,192.168.47.113:2181
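Listing the topics confirms the topic was created; to inspect the actual messages, a console consumer can be attached (standard Kafka tooling; the broker address is taken from the filebeat config above):

```shell
# Read messages from the beginning of the topic; Ctrl-C to stop
/usr/local/kafka/bin/kafka-console-consumer.sh \
  --bootstrap-server 192.168.47.113:9092 \
  --topic nginxlog-kafka \
  --from-beginning
```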


Reading logs from Kafka into Elasticsearch with logstash
input {
  kafka {
    bootstrap_servers => "192.168.47.113:9092"
    topics => ["nginxlog-kafka"]
    codec => "json"
  }
}

output {
  elasticsearch {
    hosts => ["192.168.47.106:9200"]
    index => "kafka-nginx-log-%{+YYYY.MM.dd}"
  }
}

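Assuming the pipeline above is saved as a config file (the path below is illustrative), logstash can validate it before starting, and the resulting daily index should then appear in Elasticsearch:

```shell
# Validate the pipeline config without starting logstash (-t = test and exit)
/usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/kafka-to-es.conf -t

# Once logstash is running, the daily index should show up in Elasticsearch
curl -s 'http://192.168.47.106:9200/_cat/indices?v' | grep kafka-nginx-log
```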




