Installation environment
⚫ VM OS: Ubuntu
⚫ Hadoop installation mode: pseudo-distributed (single node)
⚫ Package: apache-flume-1.7.0

Installation steps
(1) Extract the archive
tar -zvxf apache-flume-1.7.0-bin.tar.gz
(2) Move the extracted directory to /usr/local/ and rename it to flume-1.7.0, the path the later steps use
sudo mv apache-flume-1.7.0-bin /usr/local/flume-1.7.0
(3) Configure environment variables
If vim is not installed yet: sudo apt install vim
sudo vim /etc/profile
Append the following lines at the end:
FLUME_HOME=/usr/local/flume-1.7.0
PATH=$FLUME_HOME/bin:$PATH
export FLUME_HOME PATH
Reload the environment variables
source /etc/profile
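To confirm the new variables took effect in the current shell, a quick check (the FLUME_HOME value below assumes the install location from the steps above; adjust if yours differs):

```shell
# Check that FLUME_HOME is set and that its bin directory is on PATH.
echo "$FLUME_HOME"                                 # expect /usr/local/flume-1.7.0
echo "$PATH" | grep -q "$FLUME_HOME/bin" && echo "PATH OK"
```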
(4) Test the installation
flume-ng version
(5) Configuration files
In the flume-1.7.0/conf directory, copy flume-env.sh.template to flume-env.sh and edit its configuration:
cd /usr/local/flume-1.7.0/conf
sudo cp flume-env.sh.template flume-env.sh
sudo cp flume-conf.properties.template flume-conf.properties
sudo vim flume-env.sh
Add this line to flume-env.sh (use your own JDK path):
export JAVA_HOME=/usr/local/jdk1.8
Then edit the agent configuration:
sudo vim flume-conf.properties
Append the following, editing the paths to match your own setup:
a1.sources = r1
a1.sinks = k1
a1.channels = c1
a1.sources.r1.type = exec
a1.sources.r1.command=tail -F /usr/local/hadoop-2.10.1/logs/hadoop-hadoop-namenode-master.log
a1.sinks.k1.type = hdfs
a1.sinks.k1.hdfs.path = hdfs://192.168.254.7:9000/tmp/flume/%Y%m%d
a1.sinks.k1.hdfs.filePrefix = log-
a1.sinks.k1.hdfs.fileType = DataStream
a1.sinks.k1.hdfs.useLocalTimeStamp = true
a1.channels.c1.type = memory
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
Copy Hadoop's hdfs-site.xml and core-site.xml into flume-1.7.0/conf.
Then remove Flume's bundled guava jar so it does not clash with the guava version Hadoop provides:
sudo rm /usr/local/flume-1.7.0/lib/guava-11.0.2.jar
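Before relying on the HDFS sink, it can help to verify the source and channel on their own. The sketch below is an assumption, not part of the original steps: it keeps the same exec source and memory channel but swaps in a logger sink, so events print to the agent's console. Save it as a separate file (e.g. test-logger.properties) and start the agent against it with the same flume-ng command used later.

```properties
# Hypothetical sanity-check agent: prints events to the console instead of HDFS.
a1.sources = r1
a1.sinks = k1
a1.channels = c1
a1.sources.r1.type = exec
a1.sources.r1.command = tail -F /usr/local/hadoop-2.10.1/logs/hadoop-hadoop-namenode-master.log
a1.sinks.k1.type = logger
a1.channels.c1.type = memory
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
```

If log lines appear on the console, the exec source and channel work, and any remaining problem is on the HDFS side.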
Test launch
Edit the paths in the command to match your own setup:
flume-ng agent --conf /usr/local/flume-1.7.0/conf/ --conf-file /usr/local/flume-1.7.0/conf/flume-conf.properties --name a1 -Dflume.root.logger=DEBUG,console
In another terminal, check that events have arrived in HDFS:
hdfs dfs -ls /tmp/flume
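Because of the %Y%m%d escape in hdfs.path, the sink writes into one directory per day, so look inside today's partition (the /tmp/flume path and log- prefix come from the config above):

```shell
# Compute today's partition name the same way the sink's %Y%m%d escape does,
# then list it and peek at the first few lines of the collected log files.
day=$(date +%Y%m%d)
hdfs dfs -ls /tmp/flume/$day
hdfs dfs -cat /tmp/flume/$day/log-* | head
```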