Flink Installation
Basic installation reference:
https://blog.csdn.net/qq_36048223/article/details/116114765
Exception 1: parent directory /opt/flink/conf doesn't exist
For some reason the archive was not extracted there, so extract it into the directory manually:
tar -zxvf flink-1.13.2-bin-scala_2.11.tgz -C /opt/flink
cd /opt/flink
mv flink-1.13.2/* /opt/flink
Exception 2: Sum of configured JVM Metaspace (256.000mb (268435456 bytes)) and JVM Overhead (192.000mb (201326592 bytes)) exceed configured Total Process Memory (256.000mb (268435456 bytes)).
The configured total process memory is too small to even hold the metaspace and overhead budgets; raise it in flink-conf.yaml. Reference:
https://blog.csdn.net/NDF923/article/details/123730372
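A minimal sketch of the fix in flink-conf.yaml; the sizes below are illustrative values (tune them to your machine), but the option keys are standard Flink memory settings:

```yaml
# flink-conf.yaml -- total process memory must be large enough to hold
# JVM metaspace + JVM overhead + heap/managed memory
jobmanager.memory.process.size: 1600m
taskmanager.memory.process.size: 1728m
# Alternatively, shrink the metaspace budget instead of growing the total:
# taskmanager.memory.jvm-metaspace.size: 128m
```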
Integrating flink-cdc with Hive
https://juejin.cn/post/7176084265161982008
https://nightlies.apache.org/flink/flink-docs-release-1.13/zh/docs/connectors/table/hive/overview/
Exception 3: com.google.common.base.Preconditions.checkArgument(ZLjava/lang/String;Ljava/lang/Object;)V
See my other post: http://www.itdecent.cn/p/e6a76d8422d4
1. Delete com.google.common.base.Preconditions.class from flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar
2. Modify the guava-28.0 source by adding the following to Preconditions.java:
public static void checkArgument(String errorMessageTemplate, @Nullable Object p1) {
    throw new IllegalArgumentException(lenientFormat(errorMessageTemplate, p1));
}
public static void checkArgument(
        @Nullable String errorMessageTemplate,
        Object @Nullable ... errorMessageArgs) {
    throw new IllegalArgumentException(lenientFormat(errorMessageTemplate, errorMessageArgs));
}
3. Replace com.google.common.base.Preconditions.class inside flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar with the recompiled class
4. How to do the replacement
Unpack: create an empty directory, copy the jar into it, and run jar xvf flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar. After unpacking, remember to delete the jar itself by hand.
Repack: from inside that directory, run jar cvf flink-sql-connector-hive-3.1.2_2.11-1.13.6-update.jar .
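The unpack/patch/repack steps above can be sketched as one script. This is a hedged sketch, not a verified build recipe: it assumes the JDK's jar tool is on PATH, and ../patched/ is a placeholder for wherever you compiled the modified Preconditions class:

```shell
set -e
JAR=flink-sql-connector-hive-3.1.2_2.11-1.13.6.jar
mkdir repack && cp "$JAR" repack/ && cd repack
jar xvf "$JAR"            # unpack into the current directory
rm "$JAR"                 # don't package the jar into itself
# drop in the class recompiled from the patched guava-28.0 source
# (../patched/ is a hypothetical build-output path)
cp ../patched/com/google/common/base/Preconditions.class \
   com/google/common/base/Preconditions.class
jar cvf ../flink-sql-connector-hive-3.1.2_2.11-1.13.6-update.jar .
```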
Exception 4: [ERROR] Could not execute SQL statement. Reason:
java.lang.ClassNotFoundException: org.apache.hadoop.conf.Configuration
Run:
export HADOOP_CLASSPATH=`hadoop classpath`
Exception 5: Exception: Connection refused: localhost/127.0.0.1:8081
Stop the running yarn-session and start it again.
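One way to do the restart from the Flink home directory (a sketch: the application id is a placeholder, and session flags depend on your setup):

```shell
# find and kill the old yarn-session application
yarn application -list
yarn application -kill application_XXXX   # placeholder id
# start a fresh detached yarn-session
./bin/yarn-session.sh -d
```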
Exception 6: the Flink SQL Client shell cannot submit SQL to YARN.
Edit: /var/lib/ambari-server/resources/stacks/HDP/{HDP_VERSION}/services/FLINK/package/scripts/flink.py
If Flink is already installed, also edit: /var/lib/ambari-agent/cache/stacks/HDP/{HDP_VERSION}/services/FLINK/package/scripts/flink.py
Adding the --detached parameter to the yarn-session launch command fixes it.
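For illustration, this is the kind of launch line involved; the actual command inside flink.py varies by stack version, so treat it as a sketch:

```shell
# Ambari starts the session via yarn-session.sh; --detached lets the script
# return immediately instead of blocking in the foreground
./bin/yarn-session.sh --detached
```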
Exception 7: [ERROR] Could not execute SQL statement. Reason:
java.lang.RuntimeException: The Yarn application application_XXXX doesn't run anymore.
Note: if the yarn-session was started as the flink user (Ambari's default), then sql-client.sh must also be run as the flink user for SQL to be submitted to YARN successfully.
Finally
It is best to write
export HADOOP_CLASSPATH=`hadoop classpath`
into /etc/profile; this saves a lot of trouble when running sql-client.