TensorFlow Basic Models: Random Forest

Introduction to Random Forests

A random forest is an ensemble learning method. During training, each tree classifier is trained on a subset drawn at random, with replacement, from the sample set. At prediction time, the sample to be classified is fed to each tree classifier in turn, and its final class is decided by majority vote.[^4]

Given N samples and M variables (dimensions), the algorithm proceeds as follows:

  1. Choose a value m, the number of variables each tree classifier considers;
  2. Draw k sample sets from the data with replacement and build k tree classifiers from them. This also produces k out-of-bag data sets, which can be used later for validation.
  3. Feed each sample to be classified to every tree classifier; all classifiers then decide the final class by majority vote.
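The bootstrap-and-vote procedure above can be sketched in plain Python. This is a minimal illustration, not the TensorFlow implementation: the names (`bootstrap`, `train_stub`, `majority_vote`) are made up here, and the "tree classifiers" are stubbed out as simple threshold rules.

```python
import random
from collections import Counter

random.seed(0)  # make the toy example reproducible

def bootstrap(samples, k):
    """Draw k bootstrap sample sets (same size, with replacement)."""
    n = len(samples)
    return [[random.choice(samples) for _ in range(n)] for _ in range(k)]

def majority_vote(predictions):
    """Combine the k classifiers' outputs by majority vote."""
    return Counter(predictions).most_common(1)[0][0]

# Toy 1-D data: class 0 below x=5, class 1 from x=5 upward
data = [(x, 0) for x in range(5)] + [(x, 1) for x in range(5, 10)]

def train_stub(sample_set):
    """Stand-in for training a tree: memorise a threshold from the bootstrap set."""
    threshold = sum(x for x, _ in sample_set) / len(sample_set)
    return lambda x: 0 if x < threshold else 1

classifiers = [train_stub(s) for s in bootstrap(data, k=5)]

# Predict: every classifier votes, the majority wins
votes = [clf(8) for clf in classifiers]
print(majority_vote(votes))  # prints 1 (x=8 lies in the class-1 region)
```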

Key parameters:

  1. The number of candidate variables (the m in the procedure above);
  2. The number of trees in the forest.
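For classification, a common heuristic sets m ≈ √M. The construction log printed later in this article is consistent with that: with M = 784 features, `num_splits_to_consider` comes out as 28 = √784. Treat this as an inference from the logged value, not a documented guarantee of tensor_forest's defaults.

```python
import math

M = 784               # number of input variables (28x28 MNIST pixels)
m = int(math.sqrt(M)) # common heuristic: m = sqrt(M)
print(m)              # prints 28, matching num_splits_to_consider in the log
```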

TensorFlow Random Forest

from __future__ import print_function

import tensorflow as tf
from tensorflow.python.ops import resources
from tensorflow.contrib.tensor_forest.python import tensor_forest

# Ignore all GPUs, tf random forest does not benefit from it.
import os
os.environ["CUDA_VISIBLE_DEVICES"] = ""

A note on __future__[^1]
In short, every new Python release adds features or changes existing behaviour, and some of those changes are backwards-incompatible. Python 3 made several incompatible changes relative to Python 2.7; to use those Python 3 features under Python 2.7, you import them from the __future__ module.

  • division
10/3 = 3                   # Python 2.7, without the __future__ import
10/3 = 3.3333333333333335  # Python 2.7, with the __future__ import

As you can see, Python 2.7's default integer division rounds the result down, while after the __future__ import `/` performs true division. This is one of the important differences between Python 2 and Python 3.
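The contrast can be checked directly. On Python 3 (where the import is a no-op) or on Python 2 after the import, `/` is true division, and `//` recovers the old floor-division behaviour:

```python
from __future__ import division  # no-op on Python 3; changes `/` on Python 2

print(10 / 3)   # true division: 3.3333333333333335
print(10 // 3)  # floor division: 3
```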

  • absolute_import
# In Python 2.7, imports are relative by default (relative import), e.g.
from . import json
from .json import json_dump

Imports prefixed with a '.' are relative imports; an absolute import instead resolves the module against the system path sys.path. For example:

import os
from os import sys
  • print_function
    This is the classic difference between Python 2 and Python 3: in Python 2, print is a statement and takes no parentheses, while in Python 3 it is a function and requires them.
print "Hello world"  # Python 2.7
print("Hello world") # Python 3
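After the import, print is an ordinary function on both versions, which also unlocks its keyword arguments such as sep and end:

```python
from __future__ import print_function  # no-op on Python 3

# sep joins the arguments, end replaces the trailing newline
print("Hello", "world", sep=", ", end="!\n")  # prints: Hello, world!
```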

Importing the dataset

# Import MNIST data
from tensorflow.examples.tutorials.mnist import input_data
mnist = input_data.read_data_sets("./data/", one_hot=False)
Extracting ./data/train-images-idx3-ubyte.gz
Extracting ./data/train-labels-idx1-ubyte.gz
Extracting ./data/t10k-images-idx3-ubyte.gz
Extracting ./data/t10k-labels-idx1-ubyte.gz

Setting parameters

# Parameters
num_steps = 500    # Total steps to train
batch_size = 1024  # The number of samples per batch
num_classes = 10   # The 10 digits
num_features = 784 # Each image is 28x28 pixels
num_trees = 10     # Number of trees in the forest
max_nodes = 1000   # Maximum number of nodes per tree

A note on the Estimator API
Estimator and Dataset are both high-level APIs in TensorFlow.
An Estimator is a high-level way of creating TensorFlow models: it ships with premade models for common machine learning tasks, and you can also use it to build custom models.[^3]
contrib.tensor_forest implements the Random Forests algorithm in detail and exposes a high-level API. Instead of passing every hyperparameter to the constructor, you pass a params object whose defaults are filled in with params.fill(); Tensor Forest's own RandomForestGraphs then uses these parameters to build the whole graph.[^2]

# Input and Target data
X = tf.placeholder(tf.float32, shape=[None, num_features])
# For random forest, labels must be integers (the class id)
Y = tf.placeholder(tf.int32, shape=[None])

# Random Forest Parameters
hparams = tensor_forest.ForestHParams(num_classes=num_classes,
                                      num_features=num_features,
                                      num_trees=num_trees,
                                      max_nodes=max_nodes).fill()
# Build the Random Forest
forest_graph = tensor_forest.RandomForestGraphs(hparams)

INFO:tensorflow:Constructing forest with params =
INFO:tensorflow:{'num_trees': 10, 'max_nodes': 1000, 'bagging_fraction': 1.0, 'feature_bagging_fraction': 1.0, 'num_splits_to_consider': 28, 'max_fertile_nodes': 0, 'split_after_samples': 250, 'valid_leaf_threshold': 1, 'dominate_method': 'bootstrap', 'dominate_fraction': 0.99, 'model_name': 'all_dense', 'split_finish_name': 'basic', 'split_pruning_name': 'none', 'collate_examples': False, 'checkpoint_stats': False, 'use_running_stats_method': False, 'initialize_average_splits': False, 'inference_tree_paths': False, 'param_file': None, 'split_name': 'less_or_equal', 'early_finish_check_every_samples': 0, 'prune_every_samples': 0, 'num_classes': 10, 'num_features': 784, 'bagged_num_features': 784, 'bagged_features': None, 'regression': False, 'num_outputs': 1, 'num_output_columns': 11, 'base_random_seed': 0, 'leaf_model_type': 0, 'stats_model_type': 0, 'finish_type': 0, 'pruning_type': 0, 'split_type': 0}

Loss function

# Get training graph and loss
train_op = forest_graph.training_graph(X, Y)
loss_op = forest_graph.training_loss(X, Y)

# Measure the accuracy
infer_op, _, _ = forest_graph.inference_graph(X)
correct_prediction = tf.equal(tf.argmax(infer_op, 1), tf.cast(Y, tf.int64))
accuracy_op = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))

Training

# Initialize the variables (i.e. assign their default value) and forest resources
init_vars = tf.group(tf.global_variables_initializer(),
    resources.initialize_resources(resources.shared_resources()))

# Start TensorFlow session
sess = tf.train.MonitoredSession()

# Run the initializer
sess.run(init_vars)

# Training
for i in range(1, num_steps + 1):
    # Prepare Data
    # Get the next batch of MNIST data (both the images and their integer labels)
    batch_x, batch_y = mnist.train.next_batch(batch_size)
    _, l = sess.run([train_op, loss_op], feed_dict={X: batch_x, Y: batch_y})
    if i % 50 == 0 or i == 1:
        acc = sess.run(accuracy_op, feed_dict={X: batch_x, Y: batch_y})
        print('Step %i, Loss: %f, Acc: %f' % (i, l, acc))

# Test Model
test_x, test_y = mnist.test.images, mnist.test.labels
print("Test Accuracy:", sess.run(accuracy_op, feed_dict={X: test_x, Y: test_y}))
INFO:tensorflow:Graph was finalized.
INFO:tensorflow:Running local_init_op.
INFO:tensorflow:Done running local_init_op.
Step 1, Loss: -1.000000, Acc: 0.411133
Step 50, Loss: -254.800003, Acc: 0.892578
Step 100, Loss: -538.799988, Acc: 0.915039
Step 150, Loss: -826.599976, Acc: 0.922852
Step 200, Loss: -1001.000000, Acc: 0.926758
Step 250, Loss: -1001.000000, Acc: 0.919922
Step 300, Loss: -1001.000000, Acc: 0.933594
Step 350, Loss: -1001.000000, Acc: 0.916992
Step 400, Loss: -1001.000000, Acc: 0.916992
Step 450, Loss: -1001.000000, Acc: 0.927734
Step 500, Loss: -1001.000000, Acc: 0.917969
Test Accuracy: 0.9212

References

[1] The Python __future__ module

[2] [Machine Learning] Building a custom Estimator in TensorFlow: a deep dive into the Estimator component

[3] Datasets and Estimators in TensorFlow 1.3, explained by Google

[4] 穆晨: Random Forest (隨機森林)
