I. Getting Started
1. Creating a tensor
x = torch.rand(5, 3)
x
out:
tensor([[0.5051, 0.7017, 0.0170],
        [0.1441, 0.2476, 0.5710],
        [0.0452, 0.8690, 0.2387],
        [0.5709, 0.0098, 0.6993],
        [0.3203, 0.5124, 0.1010]])
Once created, a tensor can be printed directly, which is much more convenient than TensorFlow.
2. Tensor size
This differs slightly from NumPy: NumPy arrays use the .shape attribute,
while PyTorch tensors use the .size() method (.shape also works in PyTorch as an alias).
x.size()
out:
torch.Size([5, 3])
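To make the NumPy comparison concrete, here is a minimal sketch; it also shows that PyTorch tensors expose .shape as an alias for .size(), so NumPy habits carry over directly:

```python
import torch

x = torch.rand(5, 3)

# .size() and .shape return the same torch.Size object
print(x.size())   # torch.Size([5, 3])
print(x.shape)    # same result: .shape is an alias for .size()
print(x.size(0))  # size along a single dimension -> 5
```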
3、簡單計(jì)算
x=torch.rand(5,3)
y=torch.rand(5,3)
x
tensor([[0.5051, 0.7017, 0.0170],
        [0.1441, 0.2476, 0.5710],
        [0.0452, 0.8690, 0.2387],
        [0.5709, 0.0098, 0.6993],
        [0.3203, 0.5124, 0.1010]])
y
tensor([[0.6415, 0.5819, 0.3311],
        [0.0086, 0.4336, 0.5773],
        [0.3360, 0.5421, 0.1845],
        [0.4490, 0.1557, 0.5100],
        [0.0162, 0.5474, 0.3124]])
add = x + y  # equivalent to add = torch.add(x, y)
print(add)
out:
tensor([[1.1466, 1.2836, 0.3481],
        [0.1527, 0.6813, 1.1483],
        [0.3812, 1.4111, 0.4232],
        [1.0199, 0.1655, 1.2094],
        [0.3364, 1.0598, 0.4134]])
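Addition has a few equivalent spellings in PyTorch; a minimal sketch of the standard variants (operator, functional, out= argument, and the in-place add_ whose trailing underscore marks mutation):

```python
import torch

x = torch.rand(5, 3)
y = torch.rand(5, 3)

s1 = x + y                # operator form
s2 = torch.add(x, y)      # functional form, same result

out = torch.empty(5, 3)
torch.add(x, y, out=out)  # write the result into a preallocated tensor

y.add_(x)                 # in-place: y becomes y + x
```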
4. Interoperability with NumPy
Tensor → NumPy
a = torch.ones(5)
a
b = a.numpy()
b
out:
a:tensor([1., 1., 1., 1., 1.])
b:array([1., 1., 1., 1., 1.], dtype=float32)
NumPy → Tensor
import numpy as np
a = np.ones(5)
b = torch.from_numpy(a)
b
tensor([1., 1., 1., 1., 1.], dtype=torch.float64)
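Note the dtype=torch.float64 above: NumPy arrays default to 64-bit floats and from_numpy preserves that. More importantly, both a.numpy() and torch.from_numpy() share memory with their source (for CPU tensors), so in-place changes propagate in both directions; a minimal sketch:

```python
import numpy as np
import torch

a = torch.ones(5)
b = a.numpy()            # b shares memory with a
a.add_(1)                # in-place add on the tensor...
print(b)                 # ...is visible through the NumPy array: [2. 2. 2. 2. 2.]

c = np.ones(5)
d = torch.from_numpy(c)  # d shares memory with c
np.add(c, 1, out=c)      # in-place change to the array...
print(d)                 # ...is visible through the tensor
```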
II. How Autograd Works
First, define a tensor x:
x = torch.randn(3,4,requires_grad=True)
x
out:
tensor([[ 2.4921,  0.3292,  0.2324, -0.8859],
        [-1.3799,  1.6637, -0.5004, -0.4578],
        [-0.2573, -2.0164,  0.3258,  0.0283]], requires_grad=True)
Notice the extra argument requires_grad here.
Every tensor has a
.requires_grad attribute, which defaults to False; setting it to True when the tensor is created enables automatic differentiation. Note that for x to support gradients, it must be a floating-point type.
Continuing, we build a simple regression-style computation:
b = torch.randn(3,4,requires_grad=True)
b
tensor([[-0.2642,  0.3113,  0.0120, -1.3174],
        [ 0.1307,  1.8577,  0.0130,  0.3950],
        [-0.3580,  1.3666,  0.2026, -0.4438]], requires_grad=True)
t = 2*x*x + b
y = t.sum()
y
tensor(23.7106, grad_fn=<SumBackward0>)
In PyTorch, differentiation is done by calling the
.backward() method. Calling backward() directly computes the derivatives with respect to the leaf nodes of the computation graph. The computed derivatives are then read from the
.grad attribute.
y.backward()
x.requires_grad, b.requires_grad, t.requires_grad
(True, True, True)

We can see that requires_grad is True for x, b, and t.
x.grad
tensor([[ 9.9684,  1.3170,  0.9295, -3.5436],
        [-5.5196,  6.6548, -2.0017, -1.8311],
        [-1.0293, -8.0657,  1.3033,  0.1134]])

因?yàn)?img class="math-inline" src="https://math.jianshu.com/math?formula=y%20%3D%202x%5E2%20%2B%20b" alt="y = 2x^2 + b" mathimg="1">, y對(duì)x求導(dǎo)為4x
所以x.grad=4x
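The claim that x.grad equals 4x can be checked directly; a minimal sketch reproducing the example above:

```python
import torch

x = torch.randn(3, 4, requires_grad=True)
b = torch.randn(3, 4, requires_grad=True)

t = 2 * x * x + b
y = t.sum()
y.backward()

# dy/dx = 4x elementwise; dy/db = 1 elementwise
assert torch.allclose(x.grad, 4 * x.detach())
assert torch.allclose(b.grad, torch.ones_like(b))
```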
The full derivation follows the chain rule.
Note: calling backward() with no arguments only works for a scalar output, i.e. differentiating a scalar with respect to a scalar, vector, or matrix. For a non-scalar output, a gradient tensor must be passed to backward().
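A sketch of this restriction: calling backward() on a non-scalar raises an error unless you supply a gradient tensor (the vector in the vector–Jacobian product); passing a tensor of ones reproduces the effect of t.sum().backward():

```python
import torch

x = torch.randn(3, 4, requires_grad=True)
t = 2 * x * x  # t is a matrix, not a scalar

# t.backward() alone would raise:
#   "grad can be implicitly created only for scalar outputs"
# Passing ones computes the same gradients as t.sum().backward()
t.backward(torch.ones_like(t))

assert torch.allclose(x.grad, 4 * x.detach())
```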