Removing Flying Pixels from Depth Maps

My undergraduate thesis was about removing noise at object edges in depth maps; when generating point clouds from a depth map, flying pixels are the most visually disruptive artifact. For the spatial characteristics of flying pixels, see the paper "Identification and correction of flying pixels in range camera data". That paper also presents methods to restore the correct depth of flying pixels, but its approach relies on plane detection, which may work poorly on curved surfaces, and it runs offline. The method recorded here instead exploits object-edge information from a color image aligned to the depth map, deleting the pixels that lie between the depth edge and the color edge. Because I later found a better way to remove flying pixels and restore their correct depths, this method did not make it into my thesis, but it is fast and well suited to denoising the important subject of a depth map (for example, a person).
The flying-pixel removal was originally implemented on a Kinect v2; yesterday I reimplemented it on an Intel RealSense D435i. This post covers three parts:
  • packaging the DLL
  • calibration and registration: theory and implementation
  • removing flying pixels via edge detection
Let's start with the final result. The two images below are visualizations of the raw depth map and the processed depth map:
depth_raw.jpg

depth_process.jpg
Notice that in the second image the black band around the person has widened: that is where the flying pixels along the edges were deleted.
Packaging the .dll
Packaging the code into a .dll is not required; I only do it so I can drop it into other projects. I record the process here so I don't forget it later.
1. Create a new C++ project (Win32 Console Application) named RealsenseFilter.
2. On the next page, set "Application type" to "DLL" and check "Empty project" under "Additional options".
3. Under Header Files, create a header named MyRealsenseFilter.h with the following content:
#pragma once
// Building the DLL: leave MYREALSENSE_IMPORT undefined so the symbols are exported.
// (The original guard name LIBFREENECT_EXPORT was a leftover from the Kinect version.)
// Consumers who link against the import library define MYREALSENSE_IMPORT first.
#ifndef MYREALSENSE_IMPORT
#define MYREALSENSE_API __declspec(dllexport)
#else
#define MYREALSENSE_API __declspec(dllimport)
#endif // MYREALSENSE_IMPORT


#include <stdio.h>
#include <iostream>
#include <fstream>
#include <sstream>
#include <thread>
#include <opencv2/opencv.hpp>
#include <Eigen/Core>
#include <Eigen/LU>
extern "C"
{
    namespace MyRealsense
    {
        MYREALSENSE_API unsigned short* DepthFilter(unsigned char* rgb_data, unsigned short* depth_data, int width_depth, int high_depth, int width_color, int high_color);
    }
}
Under Source Files, create a source file named MyRealsenseFilter.cpp with the following content:
#include "MyRealsenseFilter.h"

using namespace cv;
using namespace std;


#define N1 18
#define N2 8
#define N3 2
#define winsize  3
#define threshold 19000

MYREALSENSE_API unsigned short* MyRealsense::DepthFilter(unsigned char* rgb_data, unsigned short* depth_data, int width_depth, int high_depth, int width_color, int high_color)
{   
    //// Camera intrinsic and extrinsic parameters
 
    float fx_rgb = 611.538;
    float fy_rgb = 611.753;
    float cx_rgb = 323.879;
    float cy_rgb = 237.827;

    float fx_ir = 382.913;
    float fy_ir = 382.913;
    float cx_ir = 319.329;
    float cy_ir = 234.574;

    Eigen::Matrix3f R_ir2rgb;
    Eigen::Vector3f T_ir2rgb;

    R_ir2rgb << 0.999879, -0.00836045, 0.0131368, 0.00840608, 0.999959, -0.00342227, -0.0131076, 0.00353228, 0.999908;
    T_ir2rgb << 0.0150998, 0.000269174, 0.000144462;

    //cout << "R_M is:" << R_ir2rgb << endl;
    //cout << "T_M is :" << T_ir2rgb << endl;

    //// Wrap the incoming buffers as cv::Mat (imshow calls left commented out)
    Mat rgb_raw = Mat(high_color, width_color, CV_8UC3);
    Mat depth_raw = Mat(high_depth, width_depth, CV_16UC1);
    Mat depth_raw_low = Mat(high_depth, width_depth, CV_8UC1);
    memcpy(rgb_raw.data, rgb_data, width_color*high_color * 3);
    memcpy(depth_raw.data, depth_data, width_depth*high_depth * 2);
    depth_raw.convertTo(depth_raw_low, CV_8UC1);
    //imshow("raw_depth:", depth_raw_low);
    //imshow("raw_rgb:", rgb_raw);
    //waitKey(1);
    
    //// Camera intrinsic matrices (non-homogeneous form)
    Eigen::Matrix3f K_ir;           // IR (depth) camera intrinsics
    K_ir <<
        fx_ir, 0, cx_ir,
        0, fy_ir, cy_ir,
        0, 0, 1;

    Eigen::Matrix3f K_rgb;          // RGB camera intrinsics
    K_rgb <<
        fx_rgb, 0, cx_rgb,
        0, fy_rgb, cy_rgb,
        0, 0, 1;

    //// Compute the equivalent pixel-space R and T; see my senior labmate's blog for details: https://blog.csdn.net/jiaojialulu/article/details/53154045
    Eigen::Matrix3f R;
    Eigen::Vector3f T;
    R = K_rgb*R_ir2rgb*K_ir.inverse();
    T = K_rgb*T_ir2rgb;
    
    /*
    cout << "K_rgb:\n" << K_rgb << endl;
    cout << "K_ir:\n" << K_ir << endl;
    cout << "R:\n" << R << endl;
    cout << "T:\n" << T << endl;
    */
    
    //// Project every depth pixel into the color image
    Mat result(high_depth, width_depth, CV_8UC3); // holds the color value found for each depth pixel
    int i = 0;
    for (int row = 0; row < high_depth; row++)
    {
        for (int col = 0; col < width_depth; col++)
        {
            unsigned short* p = (unsigned short*)depth_raw.data;
            unsigned short depthValue = p[row * width_depth + col];
            if (depthValue != 0 && depthValue != 65535) // skip invalid (0) and saturated (65535) depths
            {
                // Project onto the color image: Z_rgb * p_rgb = R * Z_ir * p_ir + T.
                // Depth is in millimeters, so divide by 1000 for meters; T = K_rgb * T_ir2rgb
                // is already metric since the SDK extrinsics translation is in meters.
                Eigen::Vector3f uv_depth(col, row, 1.0f);                            // p_ir
                Eigen::Vector3f uv_color = depthValue / 1000.f * R * uv_depth + T;   // Z_rgb * p_rgb
                int X = static_cast<int>(uv_color[0] / uv_color[2]);                 // divide out Z_rgb to get p_rgb
                int Y = static_cast<int>(uv_color[1] / uv_color[2]);                 // Z acts as the scale factor (see my handwritten notes)
                
                if ((X >= 0 && X < width_color) && (Y >= 0 && Y < high_color))
                {
                    //cout << "X:       " << X << "     Y:      " << Y << endl;
                    result.data[i * 3]     = rgb_raw.data[3 * (Y * width_color + X)];
                    result.data[i * 3 + 1] = rgb_raw.data[3 * (Y * width_color + X) + 1];
                    result.data[i * 3 + 2] = rgb_raw.data[3 * (Y * width_color + X) + 2];
                }
                else // the corresponding color pixel is outside the color image; zero this point
                {
                    result.data[i * 3] = 0;
                    result.data[i * 3 + 1] = 0;
                    result.data[i * 3 + 2] = 0;
                }                               
            }   
            else // invalid depth value (0 or saturated)
            {
                result.data[i * 3] = 0;
                result.data[i * 3 + 1] = 0;
                result.data[i * 3 + 2] = 0;
            }
            i++;
            
        }
    }
    //cout << "result size is :" << result.size() << endl;
    //imshow("reg_rgb:", result);
    //waitKey(1);

    //// Edge detection

    // Allocated once (static) and returned to the caller, so the buffer outlives this call.
    static unsigned short* new_depth_data = new unsigned short[width_depth * high_depth];
    
    Mat src2 = Mat(high_depth, width_depth, CV_8UC1);
    Mat rgbMat = Mat(high_color, width_color, CV_8UC3);
    Mat depthMat = Mat(high_depth, width_depth, CV_16UC1);
    Mat src1, src3, out1, out2;
    memcpy(rgbMat.data, result.data, width_color*high_color * 3);
    memcpy(depthMat.data, depth_data, width_depth*high_depth * 2);
    src1 = rgbMat.clone();
    src3 = depthMat.clone();
    src3.convertTo(src2, CV_8UC1);
    blur(src2, src2, Size(3, 3));
    Canny(src2, out2, 500, 350);
    //imshow("depth_edge:", out2);
    Canny(src1, out1, 480, 250);
    //imshow("rgb_edge", out1);
    //imshow("color", rgbMat);
    //imshow("depth", src2);

    
    // Clean up the registered color edge map: pixels with no color correspondence were
    // set to black above, and those black borders themselves trigger Canny, so suppress
    // any edges next to a black registered pixel.
    for (int i = N3; i < high_depth - N3; i++)
    {
        for (int j = N3; j < width_depth - N3; j++)
        {
            if (*(unsigned short*)(result.data + i*result.step[0] + j *result.step[1]) == 0) // first two channels are zero
            {
                for (int n = 0; n < N3; n++)
                {
                    if (*(out1.data + i*out1.step[0] + (j + n) *out1.step[1]) != 0)
                        *(out1.data + i*out1.step[0] + (j + n) *out1.step[1]) = 0;

                    if (*(out1.data + i*out1.step[0] + (j - n) *out1.step[1]) != 0)
                        *(out1.data + i*out1.step[0] + (j - n) *out1.step[1]) = 0;

                    if (*(out1.data + (i + n)*out1.step[0] + j  *out1.step[1]) != 0)
                        *(out1.data + (i + n)*out1.step[0] + j  *out1.step[1]) = 0;

                    if (*(out1.data + (i - n)*out1.step[0] + j  *out1.step[1]) != 0)
                        *(out1.data + (i - n)*out1.step[0] + j  *out1.step[1]) = 0;

                }
            }
        }
    }
    //// Remove isolated edge pixels (outliers) from the color edge map
    for (int i = winsize; i < high_depth - winsize; i++)
    {
        for (int j = winsize; j < width_depth - winsize; j++)
        {
            int num = 0;
            for (int k = i - winsize; k <= i + winsize; k++)
            {
                for (int l = j - winsize; l <= j + winsize; l++)
                {
                    if (*(out1.data + k*out1.step[0] + l *out1.step[1]) != 0)
                        num++;
                }
            }
            if (num < 4)
                *(out1.data + i*out1.step[0] + j *out1.step[1]) = 0;

        }
    }
    
    //imshow("rgb_edge_processed:", out1);

    // For each depth-edge pixel that has no color edge at the same location, scan toward
    // the nearest color edge (up to N1 pixels horizontally, N2 vertically) and zero the
    // depth pixels in between: these are the flying pixels.
    for (int i = N2; i < high_depth - N2; i++)
    {
        for (int j = N1; j < width_depth - N1; j++)
        {
            int pro = 1;
            if (*(out2.data + i*out2.step[0] + j *out2.step[1]) != 0 && *(out1.data + i*out1.step[0] + j *out1.step[1]) == 0)
            {
                for (int index1 = 1; index1 <= N1; index1++)
                {
                    if (*(out1.data + i*out1.step[0] + (j + index1) *out1.step[1]) != 0 )
                    {
                        pro = 0;
                        if (*(out1.data + i*out1.step[0] + (j + index1) *out1.step[1]) != 0 && index1 !=1 )
                        {
                            for (int k = 0; k < index1-1 ; k++)
                            {
                                *((unsigned short*)(depthMat.data + i*depthMat.step[0] + (j + k)*depthMat.step[1])) = 0;
                            }
                        }
                        
                        /*
                        if (*(out1.data + i*out1.step[0] + (j - index1) *out1.step[1]) != 0 )
                        {
                            for (int k = 0; k < index1 ; k++)
                            {
                                *((unsigned short*)(depthMat.data + i*depthMat.step[0] + (j - k)*depthMat.step[1])) = 0;
                            }
                        }
                        */

                        break;
                    }
                }
                if (pro == 1)
                {
                    for (int index2 = 1; index2 <= N2; index2++)
                    {
                        if (*(out1.data + (i + index2)*out1.step[0] + j *out1.step[1]) == 0 && *(out1.data + (i - index2)*out1.step[0] + j *out1.step[1]) == 0)
                            continue;
                        else
                        {
                            if (*(out1.data + (i + index2)*out1.step[0] + j *out1.step[1]) != 0 )
                            {
                                for (int k = 0; k < index2  ; k++)
                                {
                                    *((unsigned short*)(depthMat.data + (i + k)*depthMat.step[0] + j*depthMat.step[1])) = 0;
                                }
                            }

                            if (*(out1.data + (i - index2)*out1.step[0] + j *out1.step[1]) != 0 )
                            {
                                for (int k = 0; k < index2 ; k++)
                                {
                                    *((unsigned short*)(depthMat.data + (i - k)*depthMat.step[0] + j*depthMat.step[1])) = 0;
                                }
                            }



                            break;
                        }
                    }
                }
            }
    
        }
    }


    
    
    //// Statistical filtering: delete pixels whose summed local depth gradient is too large
    Mat depth_threshold = Mat::zeros(high_depth, width_depth, CV_16UC1);
    for (int i = winsize; i < high_depth - winsize; i++)
    {
        for (int j = winsize; j < width_depth - winsize; j++)
        {
            unsigned int sum_depth = 0; // wide accumulator: the window sum can exceed 65535
            for (int k = i - winsize + 1; k <= i + winsize - 1; k++)
            {
                for (int l = j - winsize + 1; l <= j + winsize - 1; l++)
                {
                    sum_depth += abs((*((unsigned short*)(depthMat.data + (k - 1) *depthMat.step[0] + l*depthMat.step[1]))
                        - *((unsigned short*)(depthMat.data + k *depthMat.step[0] + l *depthMat.step[1]))));
                    sum_depth += abs((*((unsigned short*)(depthMat.data + (k + 1) *depthMat.step[0] + l*depthMat.step[1]))
                        - *((unsigned short*)(depthMat.data + k *depthMat.step[0] + l *depthMat.step[1]))));
                    sum_depth += abs((*((unsigned short*)(depthMat.data + k  *depthMat.step[0] + (l + 1)*depthMat.step[1]))
                        - *((unsigned short*)(depthMat.data + k *depthMat.step[0] + l *depthMat.step[1]))));
                    sum_depth += abs((*((unsigned short*)(depthMat.data + k  *depthMat.step[0] + (l - 1)*depthMat.step[1]))
                        - *((unsigned short*)(depthMat.data + k *depthMat.step[0] + l *depthMat.step[1]))));
                }
            }
            *((unsigned short*)(depth_threshold.data + i *depth_threshold.step[0] + j * depth_threshold.step[1])) = sum_depth > 65535 ? 65535 : (unsigned short)sum_depth; // clamp on store

        }
    }

    //cout << depth_threshold << endl;

    for (int i = winsize; i < high_depth - winsize; i++)
    {
        for (int j = winsize; j < width_depth - winsize; j++)
        {
            if (*((unsigned short*)(depth_threshold.data + i *depth_threshold.step[0] + j * depth_threshold.step[1])) > threshold)
                *((unsigned short*)(depthMat.data + i *depthMat.step[0] + j * depthMat.step[1])) = 0;

        }
    }
    

    memcpy(new_depth_data, depthMat.data, width_depth*high_depth * 2);
    depthMat.convertTo(depthMat, CV_8UC1);
    //imshow("depth after process:", depthMat);
    //waitKey(1);
    
    return new_depth_data;

}
Now build the project (Release) and the .dll will be generated under x64/Release/ in your solution directory.
4. To test the generated .dll, add a new project named DLLTest to the same solution, configured like a normal C++ project: Win32 Console Application, application type "Console application", with "Precompiled header" checked under additional options. Create a DllTest.cpp under Source Files with the following content:
// DllTest.cpp : defines the entry point for the console application.
//
#include "stdafx.h"
#include <Windows.h>
#include <iostream>
#include <librealsense2/rs.hpp> // Include RealSense Cross Platform API
#include <opencv2/opencv.hpp>   // Include OpenCV API
#include <stdio.h>
#include "example.hpp"          // Include short list of convenience functions for rendering
#include <algorithm>  

using namespace cv;
using namespace std;

// Function-pointer type matching the function exported by the DLL
typedef unsigned short* (*DepthFilter)(unsigned char* , unsigned short* , int , int , int , int );

int  _tmain(int argc, _TCHAR* argv[]) try
{
    static unsigned short* depthData = new unsigned short[640 * 480];
    static unsigned char* rgbData = new unsigned char[640 * 480 * 3];
    static unsigned short* depthData1 = nullptr; // filled by the DLL call below

    HINSTANCE hlib = LoadLibrary(_T("RealSenseFilter.dll")); // HINSTANCE is a handle to the loaded module
    if (!hlib)
    {
        std::cout << "Failed to load the DLL!" << std::endl;
        return -1;
    }

    DepthFilter MyDepthFilter = (DepthFilter)GetProcAddress(hlib, "DepthFilter");

    if (!MyDepthFilter)
    {
        std::cout << "GetProcAddress returned a null function pointer!" << std::endl;
        FreeLibrary(hlib);
        return -2;
    }

    
    
    float get_depth_scale(rs2::device dev); // forward declaration; returns the depth scale factor (0.001 here)
    rs2::colorizer color_map1;
    rs2::colorizer color_map2;
    rs2::pipeline pipe;
    rs2::config cfg;
    cfg.enable_stream(RS2_STREAM_DEPTH, 640, 480, RS2_FORMAT_Z16, 30); // configure the camera output streams
    cfg.enable_stream(RS2_STREAM_COLOR, 640, 480, RS2_FORMAT_RGB8, 30);
    rs2::pipeline_profile profile = pipe.start(cfg);
    float depth_scale = get_depth_scale(profile.get_device());
    cout << "depth_scale is:" << depth_scale << endl;
    

    while (waitKey(1) < 0)
    {
        
        rs2::frameset data = pipe.wait_for_frames();
        rs2::frame depth = data.get_depth_frame();
        rs2::frame color = data.get_color_frame();

        memcpy(depthData, (void*)depth.get_data(), 640 * 480 * 2);
        memcpy(rgbData, (void*)color.get_data(), 640 * 480 * 3);
        rs2::frame depth1 = depth.apply_filter(color_map1);
        depthData1 = MyDepthFilter(rgbData, depthData, 640, 480, 640, 480);
        
        memcpy((void*)depth.get_data(), depthData1, 640 * 480 * 2);
        rs2::frame depth2 = depth.apply_filter(color_map2);
        const int w = depth2.as<rs2::video_frame>().get_width();
        const int h = depth2.as<rs2::video_frame>().get_height();

        // Create OpenCV matrix of size (w,h) from the colorized depth data
        Mat image1(Size(w, h), CV_8UC3, (void*)depth1.get_data(), Mat::AUTO_STEP);
        Mat image2(Size(w, h), CV_8UC3, (void*)depth2.get_data(), Mat::AUTO_STEP);
        imshow("color_depth_raw", image1);
        imshow("color_depth_process", image2);
        waitKey(1);

    }


    return EXIT_SUCCESS;
}

catch (const rs2::error & e)
{
    std::cerr << "RealSense error calling " << e.get_failed_function() << "(" << e.get_failed_args() << "):\n    " << e.what() << std::endl;
    return EXIT_FAILURE;
}
catch (const std::exception& e)
{
    std::cerr << e.what() << std::endl;
    return EXIT_FAILURE;
}

float get_depth_scale(rs2::device dev)
{
    // Go over the device's sensors
    for (rs2::sensor& sensor : dev.query_sensors())
    {
        // Check if the sensor is a depth sensor
        if (rs2::depth_sensor dpt = sensor.as<rs2::depth_sensor>())
        {
            return dpt.get_depth_scale();
        }
    }
    throw std::runtime_error("Device does not have a depth sensor");
}
If loading the .dll fails, search online for a fix; most likely the .dll depends on other .dlls the program cannot find. The tool "Dependency Walker" can track down the missing ones.
That concludes the DLL setup.
Camera Calibration and Registration
Typing out the formulas is tedious, so as usual here are photos of my handwritten notes (facepalm):
Calibration principle (handwritten notes)
Depth-to-color alignment (handwritten notes)
The Intel RealSense D435i intrinsics and the depth-to-color rotation and translation can all be obtained from the RealSense SDK 2.0 examples (project "sensor-control"). Intrinsics differ per stream profile; in my code the depth intrinsics correspond to the 640 × 480, Z16, 30 FPS stream, and the color intrinsics to the 640 × 480, RGB8, 30 FPS stream. For the alignment I register color to depth, i.e., I look up a color value for every depth pixel. Note, however, that the two cameras are horizontally offset and have different fields of view (the depth camera's is larger), so not every depth pixel actually has a corresponding color value; I do it this way anyway because it keeps the depth-map processing simple. The registration code follows my senior labmate's blog post, which is linked in the code; see that post for a detailed walkthrough.
Removing Flying Pixels via Edge Detection
The idea is simple: flying pixels sit at object boundaries, so the depth edge bulges slightly outward relative to the "true" edge. Where does the true edge come from? That is exactly why we registered the color image to the depth map: running Canny on the registered color image yields the object's true edges, and combining those with the depth edges, we simply delete the pixels lying between the two. See the code for the details.