Understanding Several Cost Volume Variants in Stereo Vision

In binocular stereo matching, the cost volume is a fundamental concept. Below are several common ways of computing a cost volume, together with the intuition behind each.
We first show a reference implementation of three typical variants:

# Method of a cost-volume module; assumes `import torch` at module level.
def forward(self, left_feature, right_feature):
    # left_feature, right_feature: [B, C, H, W] feature maps from the two views
    b, c, h, w = left_feature.size()
    if self.feature_similarity == 'difference':
        # Element-wise subtraction at each disparity shift
        cost_volume = left_feature.new_zeros(b, c, self.max_disp, h, w)  # [B, C, D, H, W]
        for i in range(self.max_disp):
            if i > 0:
                cost_volume[:, :, i, :, i:] = left_feature[:, :, :, i:] - right_feature[:, :, :, :-i]
            else:
                cost_volume[:, :, i, :, :] = left_feature - right_feature
    elif self.feature_similarity == 'concat':
        # Channel-wise concatenation at each disparity shift
        cost_volume = left_feature.new_zeros(b, 2 * c, self.max_disp, h, w)  # [B, 2C, D, H, W]
        for i in range(self.max_disp):
            if i > 0:
                cost_volume[:, :, i, :, i:] = torch.cat((left_feature[:, :, :, i:], right_feature[:, :, :, :-i]), dim=1)
            else:
                cost_volume[:, :, i, :, :] = torch.cat((left_feature, right_feature), dim=1)
    elif self.feature_similarity == 'correlation':
        # Mean over channels of the element-wise product (dot product / C)
        cost_volume = left_feature.new_zeros(b, self.max_disp, h, w)  # [B, D, H, W]
        for i in range(self.max_disp):
            if i > 0:
                cost_volume[:, i, :, i:] = (left_feature[:, :, :, i:] * right_feature[:, :, :, :-i]).mean(dim=1)
            else:
                cost_volume[:, i, :, :] = (left_feature * right_feature).mean(dim=1)
    else:
        raise NotImplementedError
    return cost_volume
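As a quick sanity check on the output shapes, the three branches can be sketched in plain NumPy for a single sample (batch dimension omitted; `build_cost_volume` and its signature are my own names, not from the original module):

```python
import numpy as np

def build_cost_volume(left, right, max_disp, mode):
    """Sketch of the three cost-volume variants for [C, H, W] features."""
    c, h, w = left.shape
    if mode == 'difference':          # element-wise subtraction -> [C, D, H, W]
        cv = np.zeros((c, max_disp, h, w))
        for i in range(max_disp):
            cv[:, i, :, i:] = left[:, :, i:] - (right[:, :, :-i] if i > 0 else right)
    elif mode == 'concat':            # channel concatenation -> [2C, D, H, W]
        cv = np.zeros((2 * c, max_disp, h, w))
        for i in range(max_disp):
            cv[:, i, :, i:] = np.concatenate(
                (left[:, :, i:], right[:, :, :-i] if i > 0 else right), axis=0)
    elif mode == 'correlation':       # mean of products over C -> [D, H, W]
        cv = np.zeros((max_disp, h, w))
        for i in range(max_disp):
            cv[i, :, i:] = (left[:, :, i:] * (right[:, :, :-i] if i > 0 else right)).mean(axis=0)
    else:
        raise NotImplementedError(mode)
    return cv

rng = np.random.default_rng(0)
left = rng.standard_normal((8, 6, 10))   # C=8, H=6, W=10
right = rng.standard_normal((8, 6, 10))
print(build_cost_volume(left, right, 4, 'difference').shape)   # (8, 4, 6, 10)
print(build_cost_volume(left, right, 4, 'concat').shape)       # (16, 4, 6, 10)
print(build_cost_volume(left, right, 4, 'correlation').shape)  # (4, 6, 10)
```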

Difference

The difference variant is the most intuitive: as left_feature and right_feature are shifted against each other step by step, corresponding elements are subtracted, and the resulting element-wise differences indicate how well the two features match at the current offset. An example:

import numpy as np

n = 3
m = 4
left = np.tile(np.arange(m), (n, 1))
right = np.tile(np.arange(1, m + 1), (n, 1))
max_dis = 3
result = np.zeros((max_dis, n, m))  # one [n, m] slice per disparity
for i in range(max_dis):
    if i > 0:
        result[i, :, i:] = left[:, i:] - right[:, :-i]
    else:
        result[i] = left - right
left:
[[0 1 2 3]
 [0 1 2 3]
 [0 1 2 3]]

right:
[[1 2 3 4]
 [1 2 3 4]
 [1 2 3 4]]

For simplicity, assume there is no C dimension and consider only H and W: we have two [3,4] left/right features. With max_dis = 3, we first create a [3,3,4] result:

result:
[[[0. 0. 0. 0.]
  [0. 0. 0. 0.]
  [0. 0. 0. 0.]]

 [[0. 0. 0. 0.]
  [0. 0. 0. 0.]
  [0. 0. 0. 0.]]

 [[0. 0. 0. 0.]
  [0. 0. 0. 0.]
  [0. 0. 0. 0.]]]

In for i in range(max_dis), the values of each shifted subtraction are written into the corresponding [3,4] slice result[i], finally giving:

result:
[[[-1. -1. -1. -1.]
  [-1. -1. -1. -1.]
  [-1. -1. -1. -1.]]

 [[ 0.  0.  0.  0.]
  [ 0.  0.  0.  0.]
  [ 0.  0.  0.  0.]]

 [[ 0.  0.  1.  1.]
  [ 0.  0.  1.  1.]
  [ 0.  0.  1.  1.]]]

Note that at i = 1 the values in result are all 0: at a shift of 1 pixel, left_feature and right_feature match perfectly, which agrees with what we can see by inspection.
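Once such a volume is built, a disparity map can be read off by winner-take-all over the D axis. A minimal sketch on the same toy features, using the absolute difference as the cost (masking invalid border positions with +inf is my own choice here):

```python
import numpy as np

n, m = 3, 4
left = np.tile(np.arange(m), (n, 1))
right = np.tile(np.arange(1, m + 1), (n, 1))

max_dis = 3
cost = np.full((max_dis, n, m), np.inf)  # +inf marks positions with no valid match
for i in range(max_dis):
    if i > 0:
        cost[i, :, i:] = np.abs(left[:, i:] - right[:, :-i])
    else:
        cost[0] = np.abs(left - right)

disparity = cost.argmin(axis=0)  # winner-take-all along D
print(disparity)
# [[0 1 1 1]
#  [0 1 1 1]
#  [0 1 1 1]]
```

Every valid column recovers the true shift of 1; column 0 has no valid match at disparity 1 or 2 (it is occluded in this toy setup), so it falls back to disparity 0.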

Concat

This variant needs little explanation: the two features are simply concatenated along the C dimension at each disparity. Note that the concatenation itself contains no explicit similarity measure; the network's subsequent (typically 3D) convolutions are left to learn the matching cost from the stacked features.

Correlation

Suppose a feature map has shape [C,H,W]. It can be viewed as H*W elements, each a vector of length C; indeed, in a convolutional network the features of a patch of the original image are represented by exactly such a vector.
The similarity of two high-dimensional vectors can be measured with the dot product:
\vec{a} \cdot \vec{b} = ||\vec{a}|| \, ||\vec{b}|| \cos{\theta}
Geometrically, the dot product is the product of the two vectors' lengths and the cosine of the angle between them; equivalently, it is the projection of \vec{a} onto \vec{b} multiplied by the length of \vec{b}. It therefore measures how similar the two vectors are in direction: the larger the value, the more similar they are, and it is 0 when the vectors are orthogonal. Multiplying two [C,1,1] vectors element-wise and summing is exactly this dot product. To remove the influence of the vectors' own magnitudes, the feature maps can first be normalized along the C dimension.
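The equivalence between "normalize first, then element-wise multiply and sum" and cosine similarity can be checked directly (a small sketch with made-up vectors):

```python
import numpy as np

a = np.array([3.0, 4.0])
b = np.array([4.0, 3.0])

# Direct cosine: dot product divided by the product of the norms
cos_direct = a.dot(b) / (np.linalg.norm(a) * np.linalg.norm(b))

# Normalize first; the plain dot product is then already the cosine
a_n = a / np.linalg.norm(a)
b_n = b / np.linalg.norm(b)
cos_normalized = (a_n * b_n).sum()

print(cos_direct, cos_normalized)  # both 0.96
```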
For example:

left:
tensor([[[7., 6., 5., 4., 3.],
         [7., 6., 5., 4., 3.],
         [7., 6., 5., 4., 3.]],

        [[1., 2., 3., 4., 5.],
         [1., 2., 3., 4., 5.],
         [1., 2., 3., 4., 5.]]])
right:
tensor([[[5., 4., 3., 2., 1.],
         [5., 4., 3., 2., 1.],
         [5., 4., 3., 2., 1.]],

        [[3., 4., 5., 6., 7.],
         [3., 4., 5., 6., 7.],
         [3., 4., 5., 6., 7.]]])

We have two [2,3,5] feature maps. Normalizing each along the C dimension with x_normalized = F.normalize(x, dim=0) gives:

left_normalized:
tensor([[[0.9899, 0.9487, 0.8575, 0.7071, 0.5145],
         [0.9899, 0.9487, 0.8575, 0.7071, 0.5145],
         [0.9899, 0.9487, 0.8575, 0.7071, 0.5145]],

        [[0.1414, 0.3162, 0.5145, 0.7071, 0.8575],
         [0.1414, 0.3162, 0.5145, 0.7071, 0.8575],
         [0.1414, 0.3162, 0.5145, 0.7071, 0.8575]]])
right_normalized:
tensor([[[0.8575, 0.7071, 0.5145, 0.3162, 0.1414],
         [0.8575, 0.7071, 0.5145, 0.3162, 0.1414],
         [0.8575, 0.7071, 0.5145, 0.3162, 0.1414]],

        [[0.5145, 0.7071, 0.8575, 0.9487, 0.9899],
         [0.5145, 0.7071, 0.8575, 0.9487, 0.9899],
         [0.5145, 0.7071, 0.8575, 0.9487, 0.9899]]])

With max_dis = 5, we first create a [2,5,3,5] result. In for i in range(max_dis), the per-channel products of the two normalized maps at each shift are written into the corresponding [2,3,5] slice result[:, i], giving:

result:
[[[[0.84887475 0.67082036 0.44117653 0.2236068  0.07276069]
   [0.84887475 0.67082036 0.44117653 0.2236068  0.07276069]
   [0.84887475 0.67082036 0.44117653 0.2236068  0.07276069]]

  [[0.         0.81348926 0.6063391  0.36380345 0.16269785]
   [0.         0.81348926 0.6063391  0.36380345 0.16269785]
   [0.         0.81348926 0.6063391  0.36380345 0.16269785]]

  [[0.         0.         0.73529422 0.49999997 0.26470593]
   [0.         0.         0.73529422 0.49999997 0.26470593]
   [0.         0.         0.73529422 0.49999997 0.26470593]]

  [[0.         0.         0.         0.6063391  0.36380345]
   [0.         0.         0.         0.6063391  0.36380345]
   [0.         0.         0.         0.6063391  0.36380345]]

  [[0.         0.         0.         0.         0.44117653]
   [0.         0.         0.         0.         0.44117653]
   [0.         0.         0.         0.         0.44117653]]]


 [[[0.07276069 0.2236068  0.44117653 0.67082036 0.84887475]
   [0.07276069 0.2236068  0.44117653 0.67082036 0.84887475]
   [0.07276069 0.2236068  0.44117653 0.67082036 0.84887475]]

  [[0.         0.16269785 0.36380345 0.6063391  0.81348926]
   [0.         0.16269785 0.36380345 0.6063391  0.81348926]
   [0.         0.16269785 0.36380345 0.6063391  0.81348926]]

  [[0.         0.         0.26470593 0.49999997 0.73529422]
   [0.         0.         0.26470593 0.49999997 0.73529422]
   [0.         0.         0.26470593 0.49999997 0.73529422]]

  [[0.         0.         0.         0.36380345 0.6063391 ]
   [0.         0.         0.         0.36380345 0.6063391 ]
   [0.         0.         0.         0.36380345 0.6063391 ]]

  [[0.         0.         0.         0.         0.44117653]
   [0.         0.         0.         0.         0.44117653]
   [0.         0.         0.         0.         0.44117653]]]]

Then mean = torch.from_numpy(result).mean(dim=0) averages the result over the C dimension, giving:

mean:
tensor([[[0.4608, 0.4472, 0.4412, 0.4472, 0.4608],
         [0.4608, 0.4472, 0.4412, 0.4472, 0.4608],
         [0.4608, 0.4472, 0.4412, 0.4472, 0.4608]],

        [[0.0000, 0.4881, 0.4851, 0.4851, 0.4881],
         [0.0000, 0.4881, 0.4851, 0.4851, 0.4881],
         [0.0000, 0.4881, 0.4851, 0.4851, 0.4881]],

        [[0.0000, 0.0000, 0.5000, 0.5000, 0.5000],
         [0.0000, 0.0000, 0.5000, 0.5000, 0.5000],
         [0.0000, 0.0000, 0.5000, 0.5000, 0.5000]],

        [[0.0000, 0.0000, 0.0000, 0.4851, 0.4851],
         [0.0000, 0.0000, 0.0000, 0.4851, 0.4851],
         [0.0000, 0.0000, 0.0000, 0.4851, 0.4851]],

        [[0.0000, 0.0000, 0.0000, 0.0000, 0.4412],
         [0.0000, 0.0000, 0.0000, 0.0000, 0.4412],
         [0.0000, 0.0000, 0.0000, 0.0000, 0.4412]]])
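The whole worked example above (normalize, shift-and-multiply, average over C) can be reproduced with the following NumPy sketch (variable names are mine):

```python
import numpy as np

# The same [C=2, H=3, W=5] features as above
left = np.stack([np.tile([7., 6., 5., 4., 3.], (3, 1)),
                 np.tile([1., 2., 3., 4., 5.], (3, 1))])
right = np.stack([np.tile([5., 4., 3., 2., 1.], (3, 1)),
                  np.tile([3., 4., 5., 6., 7.], (3, 1))])

# L2-normalize along the channel dimension (axis 0)
left_n = left / np.linalg.norm(left, axis=0, keepdims=True)
right_n = right / np.linalg.norm(right, axis=0, keepdims=True)

max_dis = 5
c, h, w = left.shape
result = np.zeros((c, max_dis, h, w))
for i in range(max_dis):
    if i > 0:
        result[:, i, :, i:] = left_n[:, :, i:] * right_n[:, :, :-i]
    else:
        result[:, 0] = left_n * right_n

mean = result.mean(axis=0)  # average over C: cosine similarity / C
print(mean[2])              # valid entries are all 0.5 at disparity 2
```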

Note that at i = 2 the valid entries of mean are all 0.5 (0.5 rather than 1 because we averaged over a C dimension of length 2): at a shift of 2 pixels, left_feature and right_feature match perfectly, which again agrees with inspection.
A cost volume computed this way has one dimension fewer than the previous two variants ([B, D, H, W] instead of [B, C, D, H, W] or [B, 2C, D, H, W]), so it lightens the network's memory and compute load and speeds up processing.
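The saving is easy to quantify: the correlation volume is smaller than the difference volume by a factor of C, and smaller than the concat volume by 2C. A back-of-the-envelope check (the sizes below are illustrative, not from this article):

```python
# Illustrative feature sizes: B=1, C=32, D=192, H=256, W=512
b, c, d, h, w = 1, 32, 192, 256, 512

diff_elems = b * c * d * h * w        # difference volume: [B, C, D, H, W]
concat_elems = b * 2 * c * d * h * w  # concat volume:     [B, 2C, D, H, W]
corr_elems = b * d * h * w            # correlation:       [B, D, H, W]

print(diff_elems // corr_elems)    # 32  (factor of C)
print(concat_elems // corr_elems)  # 64  (factor of 2C)
```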
