Common commands

Start Hive with console logging:
hive --hiveconf hive.root.logger=DEBUG,console   (use this in place of a plain `hive` at startup)

Log in to MySQL on host 87 (note: no space between -p and the password):
mysql -h 10.0.90.87 -P 3306 -u username -ppwd

Connect adb to the MEmu (逍遙) emulator:
adb connect 127.0.0.1:21503

Become root, then SSH to hadoop101 on port 32022:
sudo su
ssh -p32022 hadoop101

Linux: check memory usage (in MiB):
free -m
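The `free -m` output can be scripted over; a small sketch pulling out the used/free MiB with awk (the sample output below is illustrative, not from a real host):

```shell
# Sample `free -m` output; in practice pipe the live command instead:
#   free -m | awk '/^Mem:/ {print "used:", $3, "free:", $4}'
free_output='              total        used        free      shared  buff/cache   available
Mem:           7972        3114         858         120        3999        4430
Swap:          2047           0        2047'
echo "$free_output" | awk '/^Mem:/ {print "used:", $3, "free:", $4}'
# → used: 3114 free: 858
```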
Linux: list all processes:
ps aux | less

Start a Flume agent:
flume-ng agent --conf-file /home/webadmin/liuhuanyu/flume-files/conf.txt --name a1

The JDBC connection URL in hive-site:
jdbc:mysql://realtime-3:3306/hive?characterEncoding=UTF-8
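In hive-site.xml that URL goes under `javax.jdo.option.ConnectionURL`; a sketch (the surrounding username/password/driver properties are omitted):

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://realtime-3:3306/hive?characterEncoding=UTF-8</value>
</property>
```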

Flume log-ingestion command on the 200 host:
/home/webadmin/liuhuanyu/apache-flume-1.6.0-cdh5.13.0-bin/bin/flume-ng agent --conf-file /home/webadmin/liuhuanyu/flume-files/conf.txt --name a1

Insert a row into Hive:
insert into table log_db_200 partition(log_date="2013-03-08") (guid,create_time,userid,type_flag,type,recharge_z,recharge,conver,sys,give_z,free_recharge,free_give,rela_userid,rela_recharge,rela_give,sync_flag,mentor,energy,integral,income,vip_level,flow_23,flow_24,flow_25,log_flag) values(791936071859961901,'2018-02-28 00:00:00',19506395,1,64,27,0,0,0,3,368300,16101,25697072,54,6,0,0,0,30,30,3,4,50,0,5);

Submit the Flume Spark Streaming job:
/opt/ydbsoftware/spark-2.2.0/bin/spark-submit --jars /home/webadmin/liuhuanyu/flume-files/spark-streaming-flume_2.11-2.2.0.jar,/home/webadmin/liuhuanyu/flume-files/spark-streaming-flume-sink_2.11-2.2.0.jar,/home/webadmin/liuhuanyu/flume-files/flume-ng-configuration-1.6.0.jar,/home/webadmin/liuhuanyu/flume-files/flume-ng-core-1.6.0.jar,/home/webadmin/liuhuanyu/flume-files/flume-ng-sdk-1.6.0.jar /home/webadmin/liuhuanyu/flume-files/sparkTest.jar

Submit the Kafka Spark job:
/opt/ydbsoftware/spark-2.2.0/bin/spark-submit \
--jars /opt/ydbsoftware/spark-2.2.0/hadoop-lzo-0.4.15-cdh5.13.1.jar \
--class com.esky.offline.LiveAction \
/home/webadmin/liuhuanyu/test/live.jar

Hive metadata (connect through HiveServer2 to inspect it):
!connect jdbc:hive2://realtime-3:10000

Delete the checkpoint files on HDFS:
hdfs dfs -rm -R hdfs://zhugeio/user/webadmin/liuhuanyu/logdata/checkpoint/*

Spark tuning parameters:
--num-executors 100 --executor-memory 2G --executor-cores 4 --driver-memory 1G --conf spark.default.parallelism=1000 --conf spark.storage.memoryFraction=0.5 --conf spark.shuffle.memoryFraction=0.3 --conf spark.sql.hive.filesourcePartitionFileCacheSize=5368709100 --driver-class-path
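The same flags can be kept in spark-defaults.conf instead of the command line; a hedged mapping of the values above (property names per the Spark 2.x configuration docs; note that in Spark 2.x the two memoryFraction settings are legacy and only take effect with spark.memory.useLegacyMode=true):

```
spark.executor.instances                           100
spark.executor.memory                              2g
spark.executor.cores                               4
spark.driver.memory                                1g
spark.default.parallelism                          1000
spark.storage.memoryFraction                       0.5
spark.shuffle.memoryFraction                       0.3
spark.sql.hive.filesourcePartitionFileCacheSize    5368709100
```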

Submit an ordinary Spark job (test jar, then the online jar):
/opt/ydbsoftware/spark-2.2.0/bin/spark-submit --jars /opt/ydbsoftware/spark-2.2.0/hadoop-lzo-0.4.15-cdh5.13.1.jar,/home/webadmin/liuhuanyu/flume-files/mysql-connector-java-5.1.43.jar /home/webadmin/liuhuanyu/test/casual_play.jar

/opt/ydbsoftware/spark-2.2.0/bin/spark-submit --jars /opt/ydbsoftware/spark-2.2.0/hadoop-lzo-0.4.15-cdh5.13.1.jar,/home/webadmin/liuhuanyu/flume-files/mysql-connector-java-5.1.43.jar /home/webadmin/liuhuanyu/online/casual_play.jar

Start Hue:
nohup /home/webadmin/liuhuanyu/online/hue-4.0.0/build/env/bin/supervisor &

Connect to HiveServer2 with Beeline:
!connect jdbc:hive2://realtime-3:10000

View Hive's runtime configuration:
set -v;

Run a jar with java, pointing -Djava.ext.dirs at a directory of dependency jars (Java 8's extension mechanism; removed in Java 9):
java -Djava.ext.dirs=/home/webadmin/liuhuanyu/impala-files -cp base_util.jar jdbc.util.ImpalaUtil

Hue test-run command:
nohup /home/webadmin/liuhuanyu/online/hue-4.0.0/build/env/bin/hue runserver 10.0.3.201:8000 &

Hive: enable authorization and set permissions:
set hive.security.authorization.task.factory=org.apache.hadoop.hive.ql.parse.authorization.HiveAuthorizationTaskFactoryImpl;
set hive.security.authorization.createtable.owner.grants=ALL;
set hive.security.authorization.enabled=true;
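The same three settings can be made permanent in hive-site.xml instead of per-session `set` commands; a sketch with the equivalent properties:

```xml
<property>
  <name>hive.security.authorization.enabled</name>
  <value>true</value>
</property>
<property>
  <name>hive.security.authorization.createtable.owner.grants</name>
  <value>ALL</value>
</property>
<property>
  <name>hive.security.authorization.task.factory</name>
  <value>org.apache.hadoop.hive.ql.parse.authorization.HiveAuthorizationTaskFactoryImpl</value>
</property>
```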

Hive: view a user's role grants:
show role grant user ipaychat;

Hive: grant a role to a user:
grant role pm to user maojunchi;

Kafka console consumer via ZooKeeper (the old consumer):
bin/kafka-console-consumer.sh --zookeeper 10.0.0.175:9093 --from-beginning --topic topic_data_stat2

Regex find/replace to turn a bare print(x) into print("x = " + str(x)) for debugging:
find:    print\((.+)\)
replace: print("\1 = " + str(\1))
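The same print-rewrite can be applied from the shell with sed (pattern escaped for sed, backreferences as \1; the sample input line is illustrative):

```shell
# Rewrite print(x) into print("x = " + str(x)) for quick debugging
echo 'print(total)' | sed -E 's/print\((.+)\)/print("\1 = " + str(\1))/'
# → print("total = " + str(total))
```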

Import data with Sqoop ({sPartition} is a per-partition template placeholder):
/opt/sqoop/sqoop-1.4.6-cdh5.13.0/bin/sqoop import \
--connect jdbc:mysql://192.168.90.231:3306/yt_fl_video_chat_log \
--username ytflhdcaiy --password-file file:/root/.sqoop_mysql.password \
--target-dir '/user/hive/warehouse/fl_video.db/t_videopair_log/log_date={sPartition}' \
--delete-target-dir \
--table t_videopair_log_{sPartition} \
--fields-terminated-by '|' \
--null-string '\N' \
--null-non-string '\N'
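The `{sPartition}` placeholder suggests the command is rendered once per partition; a hypothetical wrapper (the dates, variable name, and the trimmed-down template are assumptions for illustration) substituting it with sed:

```shell
# Render a {sPartition} template for each partition date (dates are examples)
template="--table t_videopair_log_{sPartition} --target-dir /user/hive/warehouse/fl_video.db/t_videopair_log/log_date={sPartition}"
for sPartition in 20180227 20180228; do
  echo "$template" | sed "s/{sPartition}/$sPartition/g"
done
```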

Command to start Apache Sentry (on host 103):
sentry --command service --conffile ${SENTRY_HOME}/conf/sentry-site.xml

Python: start an HTTP server with CGI enabled on port 10086:
python3 -m http.server --cgi 10086
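For the curl call below to return anything, the script under cgi-bin/ must emit an HTTP header block before its body; a minimal sketch of what test.sh might contain (the script name and body are assumptions):

```shell
#!/bin/bash
# Minimal CGI script (e.g. cgi-bin/test.sh): header line, blank line, then the body
echo "Content-Type: text/plain"
echo
echo "hello from cgi"
```

Remember to `chmod +x cgi-bin/test.sh`; on POSIX systems http.server refuses to run non-executable CGI scripts.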

Use curl to hit it over HTTP:
curl 0.0.0.0:10086/cgi-bin/test.sh

Check the ZooKeeper version (the stat four-letter command):
echo stat|nc 127.0.0.1 2181
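The first line of the `stat` reply carries the version string; a sketch extracting just the version number (the sample reply text is illustrative, not from a live server):

```shell
# Real use: stat_reply=$(echo stat | nc 127.0.0.1 2181)
stat_reply='Zookeeper version: 3.4.10-39d3a4f269333c922ed3db283be479f9deacaa0f, built on 03/23/2017 10:13 GMT
Latency min/avg/max: 0/0/8'
echo "$stat_reply" | head -n1 | sed 's/^Zookeeper version: //' | cut -d- -f1
# → 3.4.10
```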

Start ZooKeeper from the Kafka package (Windows):
bin\windows\zookeeper-server-start.bat config\zookeeper.properties

Start the Kafka broker from the Kafka package (Windows):
bin\windows\kafka-server-start.bat config\server.properties
