Activation functions are a key part of neural network design. The activation function used in the hidden layers controls how well the network learns; the activation function used in the output layer determines the kind of predictions the network can make.
Overview:
1. Activation functions
2. Activation functions for hidden layers
3. Activation functions for the output layer
Activation Functions
An activation function defines how the weighted sum of a node's inputs is transformed into its output.
A neural network typically has three types of layers: an input layer, hidden layers, and an output layer.
All hidden layers usually use the same activation function, while the output layer usually uses a different one. Activation functions are generally differentiable, so the network can be trained with gradient-based methods.
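To make the definition concrete, here is a minimal sketch of a single node computing a weighted sum of its inputs and passing it through an activation function; the input, weight, and bias values below are made up for illustration:

```python
# a single node: weighted sum of inputs plus bias, then an activation
def neuron(inputs, weights, bias, activation):
    weighted_sum = sum(i * w for i, w in zip(inputs, weights)) + bias
    return activation(weighted_sum)

# hypothetical example values
inputs = [0.5, -1.0, 2.0]
weights = [0.4, 0.3, 0.1]
bias = 0.05

# with ReLU as the activation function
relu = lambda x: max(0.0, x)
print(neuron(inputs, weights, bias, relu))
```

Swapping in a different activation only changes the final transformation; the weighted sum itself is computed the same way in every case.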
Activation Functions for Hidden Layers
Rectified Linear Activation (ReLU)
Logistic (Sigmoid)
Hyperbolic Tangent (Tanh)
ReLU: max(0.0, x)

# example plot for the rectified linear activation function
from matplotlib import pyplot

# rectified linear function
def rectified(x):
    return max(0.0, x)

# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [rectified(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
Sigmoid: 1.0 / (1.0 + e^-x)

# example plot for the sigmoid activation function
from math import exp
from matplotlib import pyplot

# sigmoid activation function
def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [sigmoid(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
Tanh: (e^x - e^-x) / (e^x + e^-x)

# example plot for the tanh activation function
from math import exp
from matplotlib import pyplot

# tanh activation function
def tanh(x):
    return (exp(x) - exp(-x)) / (exp(x) + exp(-x))

# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [tanh(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
How to choose an activation function for hidden layers: the common practice is ReLU for multilayer perceptrons and convolutional networks, and Tanh or Sigmoid for recurrent networks.
Activation Functions for the Output Layer
Linear
Logistic (Sigmoid)
Softmax
The linear activation does not change anything; it simply returns the value it is given.

# example plot for the linear activation function
from matplotlib import pyplot

# linear activation function
def linear(x):
    return x

# define input data
inputs = [x for x in range(-10, 10)]
# calculate outputs
outputs = [linear(x) for x in inputs]
# plot inputs vs outputs
pyplot.plot(inputs, outputs)
pyplot.show()
Softmax outputs a vector of values that sum to 1.0 and can be interpreted as probabilities of class membership.
Sigmoid squashes its input into the range (0, 1), which suits outputs that represent a single probability.
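As a sketch of how softmax turns raw scores into probabilities; the input scores below are made up for illustration:

```python
from math import exp

# softmax: exponentiate each score, then normalize so the outputs sum to 1.0
def softmax(scores):
    exps = [exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax([1.0, 3.0, 2.0])
print(probs)       # largest score gets the largest probability
print(sum(probs))  # the outputs sum to 1.0
```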
How to choose an activation function for the output layer:
Regression: one node, linear activation.
Binary classification: one node, sigmoid activation.
Multiclass classification: one node per class, softmax activation.
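The three cases above can be sketched as follows; the score values and the 0.5 decision threshold are illustrative assumptions, not part of the original text:

```python
from math import exp

def sigmoid(x):
    return 1.0 / (1.0 + exp(-x))

def softmax(scores):
    exps = [exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

# regression: the linear output node's value is the prediction itself
prediction = 2.37
print(prediction)

# binary classification: threshold the sigmoid probability at 0.5
prob = sigmoid(0.8)
print(1 if prob >= 0.5 else 0)

# multiclass classification: pick the class with the highest softmax probability
probs = softmax([0.5, 2.0, 1.0])
print(probs.index(max(probs)))
```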
