Testing the flink-cdh-parcel installation

First, screenshots of the flink-cdh-parcel test installation:

clipboard1.png
clipboard22.png
clipboard33.png

Download location: https://archive.cloudera.com/csa/1.0.0.0

Step 1: Place FLINK-1.9.0-csa1.0.0.0-cdh6.3.0.jar in the following directory:

clipboard.png
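The directory in the screenshot is the Cloudera Manager CSD directory; on a default install this is typically /opt/cloudera/csd (an assumption, so confirm it against your setup). A minimal sketch of the copy:

```shell
# Assumed CSD directory for a default Cloudera Manager install;
# verify it against the path shown in the screenshot.
CSD_DIR=/opt/cloudera/csd
CSD_JAR=FLINK-1.9.0-csa1.0.0.0-cdh6.3.0.jar

if [ -f "$CSD_JAR" ]; then
    # Both steps need root on the Cloudera Manager host.
    cp "$CSD_JAR" "$CSD_DIR/"
    chown cloudera-scm:cloudera-scm "$CSD_DIR/$CSD_JAR"
else
    echo "download $CSD_JAR into the current directory first"
fi
```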

Step 2: Restart the Cloudera Manager server (cloudera-server)

clipboard1.png
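Assuming the standard service name on CM 6.x, the restart looks like this:

```shell
# Default Cloudera Manager server service name on CM 6.x (assumption);
# run as root on the CM host.
SVC=cloudera-scm-server
systemctl restart "$SVC" || echo "run this as root on the Cloudera Manager host"

# Tail the last log lines to confirm it came back up cleanly.
tail -n 20 "/var/log/$SVC/$SVC.log" 2>/dev/null || true
```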

Step 3: Download the parcel into the httpd service directory

clipboard2.png
clipboard3.png
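A sketch of fetching the parcel files into a directory served by httpd. The document root and the el7 parcel filename are assumptions; check the archive listing at the download URL above for the exact names matching your OS.

```shell
# Assumed httpd document root; adjust to your web server's layout.
REPO_DIR=/var/www/html/flink
BASE_URL=https://archive.cloudera.com/csa/1.0.0.0/parcels
# Parcel file name is illustrative (el7 build); confirm it against
# the archive listing.
PARCEL=FLINK-1.9.0-csa1.0.0.0-cdh6.3.0-el7.parcel

if mkdir -p "$REPO_DIR" 2>/dev/null; then
    cd "$REPO_DIR"
    # The parcel, its .sha checksum, and manifest.json must sit together.
    for f in "$PARCEL" "$PARCEL.sha" manifest.json; do
        wget -q "$BASE_URL/$f" || echo "failed to fetch $f"
    done
else
    echo "need write access to $REPO_DIR (run as root)"
fi
```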

Step 4: Open the Cloudera Manager parcel settings and add the repository address

clipboard4.png
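The value to add under the remote parcel repository URLs setting is the HTTP address of the directory populated in step 3 (the hostname here is a placeholder):

```
http://<your-httpd-host>/flink/
```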

Step 5: Refresh → Download → Distribute → Activate → Use

clipboard5.png
clipboard6.png
clipboard7.png
clipboard8.png
clipboard9.png
clipboard10.png
clipboard11.png

Step 6: Add the Flink service

clipboard12.png
clipboard13png.png
clipboard14.png

Step 7: Restart the Cloudera Management Service (cms)

clipboard16.png

Step 8: Check Flink

clipboard18.png
clipboard22.png
clipboard23.png
clipboard33.png

Step 9: Using Flink:

clipboard44.png
clipboard55.png
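As a quick smoke test, the bundled WordCount example can be submitted from a gateway host. The parcel install path below is an assumption (parcels normally land under /opt/cloudera/parcels):

```shell
# Assumed layout of the activated FLINK parcel; adjust to what
# 'ls /opt/cloudera/parcels/FLINK*' shows on your gateway host.
FLINK_HOME=/opt/cloudera/parcels/FLINK/lib/flink

"$FLINK_HOME/bin/flink" run \
    "$FLINK_HOME/examples/streaming/WordCount.jar" \
    || echo "run on a gateway host with the FLINK parcel activated"
```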

Step 10: Open issues:

  • The upload-jar button on the web UI does not show up: it is visible while the page is loading, but disappears once loading finishes

    clipboard66.png
    clipboard77.png
  • Running the flink command on a client host reports an error:
The program finished with the following exception:

org.apache.flink.util.FlinkException: Failed to retrieve job list.
    at org.apache.flink.client.cli.CliFrontend.listJobs(CliFrontend.java:445)
    at org.apache.flink.client.cli.CliFrontend.lambda$list$0(CliFrontend.java:427)
    at org.apache.flink.client.cli.CliFrontend.runClusterAction(CliFrontend.java:956)
    at org.apache.flink.client.cli.CliFrontend.list(CliFrontend.java:424)
    at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1024)
    at org.apache.flink.client.cli.CliFrontend.lambda$main$10(CliFrontend.java:1096)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:422)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1875)
    at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
    at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.util.concurrent.TimeoutException
    at org.apache.flink.runtime.concurrent.FutureUtils$Timeout.run(FutureUtils.java:998)
    at org.apache.flink.runtime.concurrent.DirectExecutorService.execute(DirectExecutorService.java:211)
    at org.apache.flink.runtime.concurrent.FutureUtils.lambda$orTimeout$14(FutureUtils.java:416)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
    at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
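The TimeoutException above typically means the CLI cannot reach a running JobManager: with the default configuration, `flink list` tries to contact a standalone cluster, and on a YARN-only deployment nothing is listening. A hedged workaround sketch, assuming a working YARN client environment (memory sizes and the application id are placeholders):

```shell
# Start a detached YARN session so the CLI has a cluster to talk to.
yarn-session.sh -d -jm 1024m -tm 2048m \
    || echo "needs a configured Hadoop/YARN client environment"

# Point the CLI at the session; take the real id from
# 'yarn application -list'.
APP_ID=application_XXXXXXXXXXXXX_0001   # placeholder
flink list -yid "$APP_ID" || echo "replace APP_ID with the real application id"
```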

Step 11: Online reference for building a flink-parcel by hand:


http://blog.51yip.com/hadoop/2362.html

最后編輯于
?著作權(quán)歸作者所有,轉(zhuǎn)載或內(nèi)容合作請聯(lián)系作者
【社區(qū)內(nèi)容提示】社區(qū)部分內(nèi)容疑似由AI輔助生成,瀏覽時(shí)請結(jié)合常識與多方信息審慎甄別。
平臺(tái)聲明:文章內(nèi)容(如有圖片或視頻亦包括在內(nèi))由作者上傳并發(fā)布,文章內(nèi)容僅代表作者本人觀點(diǎn),簡書系信息發(fā)布平臺(tái),僅提供信息存儲(chǔ)服務(wù)。

相關(guān)閱讀更多精彩內(nèi)容

友情鏈接更多精彩內(nèi)容