Common supervisor commands

View task status
# supervisorctl status
SmartCoin                        RUNNING   pid 13203, uptime 0:04:05
coin                             RUNNING   pid 30744, uptime 17 days, 20:45:18
deepwellserver                   RUNNING   pid 30257, uptime 30 days, 4:13:01
jingtumassetapi                  RUNNING   pid 14536, uptime 45 days, 19:18:08
moac                             RUNNING   pid 20015, uptime 15 days, 5:15:11
new                              RUNNING   pid 10041, uptime 43 days, 22:41:56
nginx                            RUNNING   pid 18752, uptime 22:59:40
redis                            RUNNING   pid 14542, uptime 45 days, 19:18:08
sonyflakeserver                  FATAL     can't find command 'go'
sparkportal                      RUNNING   pid 26073, uptime 1 day, 23:11:17
sparkportal2                     RUNNING   pid 25732, uptime 1 day, 23:11:21
sparkportal3                     RUNNING   pid 25834, uptime 1 day, 23:11:20
sparkportal4                     RUNNING   pid 25974, uptime 1 day, 23:11:18
sparkuser                        RUNNING   pid 26957, uptime 9 days, 23:07:21
sparkwallet                      RUNNING   pid 29045, uptime 5 days, 15:11:58
summaryservice                   RUNNING   pid 14535, uptime 45 days, 19:18:08

The first column is the service name. The second is the run state: RUNNING means the task is running, FATAL means it failed to run, STARTING means it is starting up, and STOPPED means it has been stopped. Next come the process ID fields, and finally how long the task has been running.
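On a host with this many tasks, a quick filter makes failures stand out. A minimal sketch: in practice you would pipe `supervisorctl status` into awk; two sample status lines are inlined here so the snippet runs standalone.

```shell
# Filter status output for anything not RUNNING. In real use:
#   supervisorctl status | awk '$2 != "RUNNING" { print $1, $2 }'
# Sample lines are inlined so the sketch runs on its own.
not_running=$(printf '%s\n' \
  'coin             RUNNING   pid 30744, uptime 17 days, 20:45:18' \
  'sonyflakeserver  FATAL     cannot find command go' \
  | awk '$2 != "RUNNING" { print $1, $2 }')
echo "$not_running"
```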

View the status of a single task: supervisorctl status &lt;service name&gt;

# supervisorctl status sparkportal
sparkportal                      RUNNING   pid 26073, uptime 1 day, 23:12:10
Stopping/starting/restarting tasks
  1. Stop a task
    supervisorctl stop &lt;service name&gt;
# supervisorctl stop sparkportal
sparkportal: stopped
# supervisorctl status sparkportal
sparkportal                      STOPPED   Jan 05 01:59 PM
  2. Start a task
    supervisorctl start &lt;service name&gt;
# supervisorctl start sparkportal
sparkportal: started
# supervisorctl status sparkportal
sparkportal                      RUNNING   pid 32207, uptime 0:00:05
  3. Restart a task
    supervisorctl restart &lt;service name&gt;
# supervisorctl restart sparkportal
sparkportal: stopped
sparkportal: started
# supervisorctl status sparkportal
sparkportal                      RUNNING   pid 4952, uptime 0:00:03
Adding a new task
  1. Task template
[program:&lt;service name&gt;]
command=&lt;start command&gt;
process_name=%(program_name)s ; process_name expr (default %(program_name)s)
numprocs=1                    ; number of processes copies to start (def 1)
directory=&lt;working directory&gt;      ; directory to cwd to before exec (def no cwd)
;umask=022                     ; umask for process (default None)
;priority=999                  ; the relative start priority (default 999)
autostart=true                ; start at supervisord start (default: true)
autorestart=unexpected        ; whether/when to restart (default: unexpected)
startsecs=1                   ; number of secs prog must stay running (def. 1)
startretries=3                ; max # of serial start failures (default 3)
exitcodes=0,2                 ; 'expected' exit codes for process (default 0,2)
stopsignal=QUIT               ; signal used to kill process (default TERM)
stopwaitsecs=10               ; max num secs to wait b4 SIGKILL (default 10)
stopasgroup=false             ; send stop signal to the UNIX process group (default false)
killasgroup=false             ; SIGKILL the UNIX process group (def false)
;user=skywell                  ; setuid to this UNIX account to run the program
;redirect_stderr=true          ; redirect proc stderr to stdout (default false)
stdout_logfile=/var/log/&lt;service name&gt;.log        ; stdout log path, NONE for none; default AUTO
stdout_logfile_maxbytes=1MB   ; max # logfile bytes b4 rotation (default 50MB)
stdout_logfile_backups=1     ; # of stdout logfile backups (default 10)
stdout_capture_maxbytes=1MB   ; number of bytes in 'capturemode' (default 0)
stdout_events_enabled=false   ; emit events on stdout writes (default false)
stderr_logfile=/var/log/&lt;service name&gt;.err        ; stderr log path, NONE for none; default AUTO
stderr_logfile_maxbytes=1MB   ; max # logfile bytes b4 rotation (default 50MB)
stderr_logfile_backups=10     ; # of stderr logfile backups (default 10)
stderr_capture_maxbytes=1MB   ; number of bytes in 'capturemode' (default 0)
stderr_events_enabled=false   ; emit events on stderr writes (default false)
environment=A="1",B="2",HOME="/home/skywell"       ; process environment additions (def no adds)
serverurl=AUTO                ; override serverurl computation (childutils)

First, add a task description file: create sparkportal.conf in the /etc/supervisor directory and copy the template above into it. Then replace &lt;service name&gt; with the task name sparkportal, &lt;start command&gt; with node www.js, and &lt;working directory&gt; with the program's directory /usr/local/sparkportal/bin.

The resulting configuration file for sparkportal:

[program:sparkportal]
command=node www.js
process_name=%(program_name)s ; process_name expr (default %(program_name)s)
numprocs=1                    ; number of processes copies to start (def 1)
directory=/usr/local/sparkportal/bin                ; directory to cwd to before exec (def no cwd)
;umask=022                     ; umask for process (default None)
;priority=999                  ; the relative start priority (default 999)
autostart=true                ; start at supervisord start (default: true)
autorestart=unexpected        ; whether/when to restart (default: unexpected)
startsecs=1                   ; number of secs prog must stay running (def. 1)
startretries=3                ; max # of serial start failures (default 3)
exitcodes=0,2                 ; 'expected' exit codes for process (default 0,2)
stopsignal=QUIT               ; signal used to kill process (default TERM)
stopwaitsecs=10               ; max num secs to wait b4 SIGKILL (default 10)
stopasgroup=false             ; send stop signal to the UNIX process group (default false)
killasgroup=false             ; SIGKILL the UNIX process group (def false)
;user=skywell                  ; setuid to this UNIX account to run the program
;redirect_stderr=true          ; redirect proc stderr to stdout (default false)
stdout_logfile=/var/log/sparkportal.log        ; stdout log path, NONE for none; default AUTO
stdout_logfile_maxbytes=1MB   ; max # logfile bytes b4 rotation (default 50MB)
stdout_logfile_backups=1     ; # of stdout logfile backups (default 10)
stdout_capture_maxbytes=1MB   ; number of bytes in 'capturemode' (default 0)
stdout_events_enabled=false   ; emit events on stdout writes (default false)
stderr_logfile=/var/log/sparkportal.err        ; stderr log path, NONE for none; default AUTO
stderr_logfile_maxbytes=1MB   ; max # logfile bytes b4 rotation (default 50MB)
stderr_logfile_backups=10     ; # of stderr logfile backups (default 10)
stderr_capture_maxbytes=1MB   ; number of bytes in 'capturemode' (default 0)
stderr_events_enabled=false   ; emit events on stderr writes (default false)
environment=A="1",B="2",HOME="/home/skywell"       ; process environment additions (def no adds)
serverurl=AUTO                ; override serverurl computation (childutils)
  2. Register the task
    supervisorctl update
# supervisorctl update
sparkportal: added process group

This command starts the task described in sparkportal.conf and brings it under supervisor's management. You can then use the status command shown earlier to check the new task's state; if it failed to run, inspect the relevant logs under /var/log to find the cause.
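When a task fails, its log files are the first place to look. A small sketch of a log-checking helper; check_logs is a hypothetical name, and the paths assume the /var/log/&lt;service name&gt;.log and .err convention used in the template above.

```shell
# Hypothetical helper: show the tail of a service's supervisor log files,
# following the stdout_logfile=/var/log/<service name>.log convention above.
check_logs() {
  svc="$1"
  for f in "/var/log/$svc.log" "/var/log/$svc.err"; do
    if [ -f "$f" ]; then
      echo "== $f =="
      tail -n 20 "$f"
    else
      echo "no log file: $f"
    fi
  done
}

check_logs sparkportal
```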

Setting environment variables

Find the environment line in the configuration file; if there is none, append environment=NAME="value" at the bottom. Separate multiple variables with commas, for example environment=NAME1="value1",NAME2="value2".

To set the Node.js runtime to production mode, add the following line:
environment=NODE_ENV="production"
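The effect of the environment line is simply to export the variable into the child process's environment before it is executed, which can be illustrated in plain shell:

```shell
# Equivalent effect in plain shell: the child process sees NODE_ENV,
# just as a program started by supervisord with environment= would.
node_env=$(NODE_ENV=production sh -c 'echo "$NODE_ENV"')
echo "NODE_ENV is $node_env"
```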
