Logstash output file to HDFS
Logstash can write event contents directly into HDFS, with support for HDFS compression formats. This requires a third-party output plugin, webhdfs, which writes through HDFS's WebHDFS REST interface,
i.e. the http://namenode00:50070/webhdfs/v1/ endpoint.
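To make the REST interface concrete, here is a sketch of how a WebHDFS v1 request URL is put together; the function name and parameters are illustrative, not the plugin's internal API:

```python
# Sketch: composing a WebHDFS v1 request URL (illustrative only,
# not the webhdfs plugin's actual internals).
from urllib.parse import urlencode

def webhdfs_url(host, port, hdfs_path, op, user):
    """Build a WebHDFS URL, e.g. the LISTSTATUS check later in this post."""
    query = urlencode({"user.name": user, "op": op})
    return "http://{}:{}/webhdfs/v1{}?{}".format(host, port, hdfs_path, query)

print(webhdfs_url("namenode00", 50070, "/", "LISTSTATUS", "hadoop"))
# http://namenode00:50070/webhdfs/v1/?user.name=hadoop&op=LISTSTATUS
```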
Installation
You can find the matching Logstash release on the official site; we use 2.3.1. Download:
https://www.elastic.co/downloads/past-releases
webhdfs plugin
GitHub repository:
git clone https://github.com/heqin5136/logstash-output-webhdfs-discontinued.git
Official documentation and usage:
https://www.elastic.co/guide/en/logstash/current/plugins-outputs-webhdfs.html
Plugin installation (Logstash is installed under /home/mtime/logstash-2.3.1):
git clone https://github.com/heqin5136/logstash-output-webhdfs-discontinued.git
cd logstash-output-webhdfs-discontinued
/home/mtime/logstash-2.3.1/bin/plugin install logstash-output-webhdfs-discontinued
Check the HDFS WebHDFS interface
curl -i "http://namenode:50070/webhdfs/v1/?user.name=hadoop&op=LISTSTATUS"
HTTP/1.1 200 OK
Cache-Control: no-cache
Expires: Thu, 13 Jul 2017 04:53:39 GMT
Date: Thu, 13 Jul 2017 04:53:39 GMT
Pragma: no-cache
Expires: Thu, 13 Jul 2017 04:53:39 GMT
Date: Thu, 13 Jul 2017 04:53:39 GMT
Pragma: no-cache
Content-Type: application/json
Set-Cookie: hadoop.auth="u=hadoop&p=hadoop&t=simple&e=1499957619679&s=KSxdSAtjXAllhn73vh1MAurG9Bk="; Path=/; Expires=Thu, 13-Jul-2017 14:53:39 GMT; HttpOnly
Transfer-Encoding: chunked
Server: Jetty(6.1.26)
Note: the active NameNode returns 200; a standby NameNode returns 403.
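Since only the active NameNode answers 200 and a standby answers 403, a client can locate the active node by probing each candidate. A minimal sketch, with the HTTP probe stubbed out (function and variable names here are made up):

```python
# Pick the active NameNode: per the note above, the active node answers
# a WebHDFS LISTSTATUS with HTTP 200, a standby with 403.
def pick_active(namenodes, probe):
    """Return the first host whose probe (an HTTP status code) is 200."""
    for host in namenodes:
        if probe(host) == 200:
            return host
    return None

# Stubbed probe for illustration; a real probe would issue the curl above.
statuses = {"namenode": 403, "standbynamenode": 200}
print(pick_active(["namenode", "standbynamenode"], statuses.get))
# standbynamenode
```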
Configuration
Add a Logstash configuration file:
vim /home/mtime/logstash-2.3.1/conf/hdfs.conf
input {
  kafka {
    zk_connect => "192.168.51.191:2181,192.168.51.192:2181,192.168.51.193:2181"  ## Kafka ZooKeeper addresses
    group_id => 'hdfs'     # consumer group
    topic_id => 'tracks'   # topic name
    consumer_threads => 1
    codec => 'json'
  }
}
filter {  ## corrects the 8-hour offset when events are written into HDFS
  date {
    match => [ "time", "yyyy-MM-dd HH:mm:ss" ]
    locale => "zh"
    timezone => "-00:00:00"
    target => "@timestamp"
  }
}
output {
  #if [app] == "mx.tc.virtualcard.service" {
    webhdfs {
      workers => 2
      host => "namenode"
      standby_host => "standbynamenode"
      port => 50070
      user => "loguser"
      path => "/Service-Data/%{+YYYY}-%{+MM}-%{+dd}/%{app}/logstash-%{+HH}.log"
      flush_size => 100
      idle_flush_time => 10
      compression => "gzip"
      retry_interval => 3
      codec => 'json'  # write the file to HDFS as JSON; otherwise the content is the rendered %{message}
    }
  #}
  stdout { codec => rubydebug }
}
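On the date filter above: Logstash keeps @timestamp in UTC and expands %{+YYYY}, %{+MM}, %{+dd}, and %{+HH} from it, so without a correction a log produced shortly after midnight Beijing time (UTC+8) would land in the previous day's directory. A small illustration of the offset:

```python
from datetime import datetime, timedelta, timezone

beijing = timezone(timedelta(hours=8))
# A log line produced at 00:30 Beijing time on 2017-07-13:
local = datetime(2017, 7, 13, 0, 30, tzinfo=beijing)
utc = local.astimezone(timezone.utc)

print(local.strftime("%Y-%m-%d"))  # 2017-07-13
print(utc.strftime("%Y-%m-%d"))    # 2017-07-12  (previous day in UTC)
```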
The remaining HDFS-related options are documented on the plugins-outputs-webhdfs page linked above.
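To see what the `path` option produces: the %{+...} tokens are expanded from @timestamp and %{app} from the event's app field. A rough Python equivalent (the app value "tracks" here is only an example):

```python
from datetime import datetime

def render_path(ts, app):
    """Approximate expansion of the webhdfs `path` pattern in the config."""
    return ts.strftime("/Service-Data/%Y-%m-%d/{app}/logstash-%H.log").format(app=app)

print(render_path(datetime(2017, 7, 13, 4, 53), "tracks"))
# /Service-Data/2017-07-13/tracks/logstash-04.log
```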
Start Logstash
cd /home/mtime/logstash-2.3.1/bin/
./logstash -f ../conf/hdfs.conf  # runs in the foreground
My GitHub blog post: https://sukbeta.github.io/2018/05/29/logstash-out-file-to-HDFS/