Machine Learning 9: Model Tuning

@[toc]

Manual Hyperparameter Tuning

  • Start with a good baseline, e.g. the default settings in high-quality toolkits, or values reported in papers

  • Tune one value at a time and retrain the model to observe the effect

  • Repeat multiple times to gain insights about

    • Which hyperparameters are important

    • How sensitive the model is to each hyperparameter

    • What the good ranges are

  • Requires careful experiment management

  • Save your training logs and hyperparameters so you can compare, share, and
    reproduce runs later

    • The simplest approach is to save logs as plain text and record key metrics in a spreadsheet

    • Better options exist, e.g. TensorBoard and Weights & Biases

  • Reproducing a run is hard; it depends on

    • Environment (hardware & library)

    • Code

    • Randomness (seed)
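The points above can be sketched in a few lines. This is a minimal illustration (not a full framework): the hyperparameter names and the "training" step are hypothetical, but it shows the two habits that matter — fixing the random seed and logging hyperparameters next to the key metric.

```python
import json
import random

# Fix the random seed so the run can be reproduced later
# (with a real framework you would also seed NumPy / the DL library).
SEED = 42
random.seed(SEED)

# Hypothetical hyperparameters for this run
hparams = {"lr": 0.1, "batch_size": 32, "seed": SEED}

# Simulated "training": the metric depends only on the seed here,
# so re-running with the same seed reproduces it exactly.
val_acc = random.random()

# The simplest experiment log: one JSON line per run,
# storing hyperparameters together with the key metric.
log_line = json.dumps({"hparams": hparams, "val_acc": round(val_acc, 4)})
print(log_line)
```

Appending one such JSON line per run to a text file is enough to compare runs later, or to import them into a spreadsheet.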

Automated Machine Learning (AutoML)

  • Automate every step in applying ML to solve real-world problems: data cleaning, feature extraction, model selection…
  • Hyperparameter optimization (HPO): find a good set of hyperparameters
    through search algorithms
  • Neural architecture search (NAS): automatically construct a good neural network architecture
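A simple HPO search algorithm is random search. The sketch below is illustrative: `validation_score` is a hypothetical stand-in for "train a model and return its validation score", and the hyperparameter names and ranges are assumptions, but the search loop itself is the standard pattern.

```python
import random

def validation_score(hparams):
    # Hypothetical stand-in for training a model and returning
    # validation accuracy; this toy function peaks near lr=0.1, depth=6.
    return -((hparams["lr"] - 0.1) ** 2) - 0.01 * (hparams["depth"] - 6) ** 2

def random_search(n_trials, seed=0):
    rng = random.Random(seed)
    best_hparams, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Sample each hyperparameter from its search range
        hparams = {
            "lr": 10 ** rng.uniform(-3, 0),  # log-uniform learning rate in [1e-3, 1]
            "depth": rng.randint(2, 12),     # integer-valued model depth
        }
        score = validation_score(hparams)
        if score > best_score:
            best_hparams, best_score = hparams, score
    return best_hparams, best_score

best, score = random_search(n_trials=50)
print(best, score)
```

Note the log-uniform sampling for the learning rate: for hyperparameters spanning several orders of magnitude, sampling the exponent uniformly explores the range far better than sampling the value directly.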

Summary

  • Hyperparameter tuning aims to find a set of good hyperparameter values
  • It is as time-consuming as data preprocessing
  • There is a trend toward using algorithms for tuning (AutoML)