Text classification with fastText

1. Getting and preparing the data

Each line of the dataset contains one or more labels, followed by the sentence:

>> head cooking.stackexchange.txt

__label__sauce __label__cheese How much does potato starch affect a cheese sauce recipe?
__label__food-safety __label__acidity Dangerous pathogens capable of growing in acidic environments
__label__cast-iron __label__stove How do I cover up the white spots on my cast iron stove?
__label__restaurant Michelin Three Star Restaurant; but if the chef is not there
__label__knife-skills __label__dicing Without knife skills, how can I quickly and accurately dice vegetables?
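Each line can be parsed by treating every leading token that starts with `__label__` as a label and the rest as the sentence. A minimal Python sketch (the helper name is ours, not part of fastText):

```python
# Parse a line in fastText supervised format into (labels, text).
# Every leading token starting with "__label__" is a label;
# the remainder of the line is the sentence.
LABEL_PREFIX = "__label__"

def parse_line(line):
    tokens = line.strip().split()
    labels = []
    i = 0
    while i < len(tokens) and tokens[i].startswith(LABEL_PREFIX):
        labels.append(tokens[i][len(LABEL_PREFIX):])
        i += 1
    return labels, " ".join(tokens[i:])

labels, text = parse_line(
    "__label__sauce __label__cheese How much does potato starch "
    "affect a cheese sauce recipe?"
)
print(labels)  # ['sauce', 'cheese']
print(text)    # How much does potato starch affect a cheese sauce recipe?
```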

Before training, split the data into a training set and a validation set at roughly a 4:1 ratio:

>> wc cooking.stackexchange.txt

  15404  169582 1401900 cooking.stackexchange.txt

>> head -n 12404 cooking.stackexchange.txt > cooking.train

>> tail -n 3000 cooking.stackexchange.txt > cooking.valid
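The head/tail split above (first 12404 lines for training, last 3000 for validation, 12404 + 3000 = 15404) can be reproduced in Python; a sketch assuming the file fits in memory, with the function name being ours:

```python
# Reproduce the head/tail split: the first n_train lines become the
# training set, the last n_valid lines the validation set.
def split_dataset(lines, n_train=12404, n_valid=3000):
    if len(lines) != n_train + n_valid:
        raise ValueError("unexpected line count")
    return lines[:n_train], lines[-n_valid:]

# Usage (file name as in the tutorial):
# with open("cooking.stackexchange.txt") as f:
#     train, valid = split_dataset(f.readlines())
```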

2. Our first classifier

Train the classifier:

>> ./fasttext supervised -input cooking.train -output model_cooking

Read 0M words

Number of words: 14543

Number of labels: 735

Progress: 100.0% words/sec/thread: 90012 lr: 0.000000 loss: 10.222594 ETA: 0h 0m

-input specifies the training file, and -output specifies where to save the model.

At the end of training, the file model_cooking.bin, containing the trained classifier, is created in the current directory.


Test the classifier interactively:

>> ./fasttext predict model_cooking.bin -

>> Which baking dish is best to bake a banana bread ?

__label__baking

>> Why not put knives in the dishwasher?

__label__food-safety

Now evaluate the model on the validation set:

>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.138

R@1 0.0595

Number of examples: 3000

P@1 and R@1 are the precision and recall at one, i.e. only the top predicted label is considered.

To compute precision and recall at five (the top five predicted labels), pass 5 as the last argument:

>> ./fasttext test model_cooking.bin cooking.valid 5

N 3000

P@5 0.0677

R@5 0.146

Number of examples: 3000

3. Advanced readers: precision and recall

The precision is the fraction of the labels predicted by fastText that are correct.

The recall is the fraction of the real labels that were successfully predicted.
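These definitions can be made concrete: for one example with gold labels T and top-k predictions P, precision at k is |P ∩ T| / k and recall at k is |P ∩ T| / |T|, and the reported numbers average this over the validation set. A sketch (function name and example labels are ours):

```python
# Average precision@k and recall@k over a set of examples.
# true_labels: one set of gold labels per example.
# predictions: one ranked list of predicted labels per example.
def precision_recall_at_k(true_labels, predictions, k):
    p_sum = r_sum = 0.0
    for gold, ranked in zip(true_labels, predictions):
        hits = len(set(ranked[:k]) & gold)  # correct labels in the top k
        p_sum += hits / k
        r_sum += hits / len(gold)
    n = len(true_labels)
    return p_sum / n, r_sum / n

gold = [{"equipment", "knives"}]
pred = [["food-safety", "knives", "baking", "equipment", "bread"]]
print(precision_recall_at_k(gold, pred, k=5))  # (0.4, 1.0)
```

With k = 5, two of the five predicted labels are correct (precision 2/5) and both real labels were found (recall 2/2), which is why P@5 tends to be lower and R@5 higher than their @1 counterparts.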

4. Making the model better

4.1 Preprocessing the data

>> cat cooking.stackexchange.txt | sed -e "s/\([.\!?,'/()]\)/ \1 /g" | tr "[:upper:]" "[:lower:]" > cooking.preprocessed.txt

>> head -n 12404 cooking.preprocessed.txt > cooking.train

>> tail -n 3000 cooking.preprocessed.txt > cooking.valid
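The sed/tr pipeline pads a small set of punctuation characters with spaces (so they become separate tokens) and lowercases the text. An equivalent sketch in Python (the function name is ours):

```python
import re

# Equivalent of the sed/tr pipeline above: put spaces around the
# punctuation characters . ! ? , ' / ( ) and lowercase everything.
def preprocess(line):
    line = re.sub(r"([.!?,'/()])", r" \1 ", line)
    return line.lower()

print(preprocess("Why not put knives in the dishwasher?"))
# why not put knives in the dishwasher ?
```

Tokenizing punctuation this way shrinks the vocabulary (note the word count drops from 14543 to 8952 below), so each remaining word is seen more often during training.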

Train a new model on the preprocessed data:

>> ./fasttext supervised -input cooking.train -output model_cooking

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 101142 lr: 0.000000 loss: 11.018550 ETA: 0h 0m


>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.172

R@1 0.0744

Number of examples: 3000

4.2 More epochs and larger learning rate

By default, fastText sees each training example only five times (5 epochs).

Use the -epoch option to increase the number of epochs:

>> ./fasttext supervised -input cooking.train -output model_cooking -epoch 25

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 92990 lr: 0.000000 loss: 7.257324 ETA: 0h 0m


>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.514

R@1 0.222

Number of examples: 3000

Another way to speed up learning is to increase the learning rate with the -lr option:

>> ./fasttext supervised -input cooking.train -output model_cooking -lr 1.0

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 91682 lr: 0.000000 loss: 6.346271 ETA: 0h 0m


>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.579

R@1 0.25

Number of examples: 3000


4.3 Word n-grams

Finally, train the model with word bigrams:

>> ./fasttext supervised -input cooking.train -output model_cooking -lr 1.0 -epoch 25 -wordNgrams 2

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 93126 lr: 0.000000 loss: 3.139972 ETA: 0h 0m

>> ./fasttext test model_cooking.bin cooking.valid

N 3000

P@1 0.61

R@1 0.264

Number of examples: 3000

Several steps improved the model's accuracy:

preprocessing the data;

changing the number of epochs (using the option -epoch, standard range [5 - 50]);

changing the learning rate (using the option -lr, standard range [0.1 - 1.0]);

using word n-grams (using the option -wordNgrams, standard range [1 - 5]).

5. Scaling things up

When the number of labels is large, training with the hierarchical softmax loss is much faster. This can be done with the option -loss hs:

>> ./fasttext supervised -input cooking.train -output model_cooking -lr 1.0 -epoch 25 -wordNgrams 2 -bucket 200000 -dim 50 -loss hs

Read 0M words

Number of words: 8952

Number of labels: 735

Progress: 100.0% words/sec/thread: 2139399 lr: 0.000000 loss: 2.142308 ETA: 0h 0m
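The speedup comes from the output layer: a full softmax evaluates all K label outputs per example, while the hierarchical softmax arranges labels as leaves of a binary tree and only evaluates the nodes on one root-to-leaf path, roughly log2(K) of them. A toy cost comparison (the numbers count output-layer evaluations only and are illustrative):

```python
import math

# Per-example output-layer cost: the full softmax touches every label;
# a balanced binary tree over the labels touches one root-to-leaf path.
def softmax_cost(num_labels):
    return num_labels

def hierarchical_softmax_cost(num_labels):
    return math.ceil(math.log2(num_labels))

K = 735  # number of labels in the cooking dataset
print(softmax_cost(K), hierarchical_softmax_cost(K))  # 735 10
```

This matches the words/sec/thread jump above (roughly 90K to over 2M), at the cost of the hierarchical softmax computing only approximate label probabilities.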
