[MATLAB] RGB-D vSLAM (1)

DDOKKON · February 8, 2025

[vSLAM with an RGB-D (depth) camera]
😊 RGB-D vSLAM example from the official MATLAB documentation
Goal: implement 3D SLAM with an RGB-D (depth) camera in MATLAB by following the official example.

1. Methodology of 3D SLAM with an RGB-D camera

The SLAM pipeline is similar in structure to the stereo-camera case.

1. Map initialization: start building the 3D space from an RGB image plus a depth image

2. Tracking: as new images are captured, compute how the feature points change between key frames

3. Local Mapping: each time a new key frame arrives, compute new map points and add them to the map

4. Loop closure: when the camera returns to a previously visited place, recognize it as a known place and correct the accumulated drift

⚠️ Sometimes the helper functions are not registered even after opening the example. In that case, create an empty function file with the helper's name and paste the helper function body into it.
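For example, if helperImportTimestampFile is missing, a quick way to register it (a minimal sketch; the actual function body has to be copied from the example page) is:

% Create a file for the missing helper on the MATLAB path, paste the helper
% body from the example page into it, and save.
edit helperImportTimestampFile.m

% Confirm that MATLAB can now find the helper
which helperImportTimestampFile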




1. Download the dataset and set up the directories

baseDownloadURL = "https://cvg.cit.tum.de/rgbd/dataset/freiburg3/rgbd_dataset_freiburg3_long_office_household.tgz"; 
dataFolder      = fullfile(tempdir, 'tum_rgbd_dataset', filesep); 
options         = weboptions('Timeout', Inf);
tgzFileName     = [dataFolder, 'fr3_office.tgz'];
folderExists    = exist(dataFolder, 'dir');

% Create a folder in a temporary directory to save the downloaded file
if ~folderExists  
    mkdir(dataFolder); 
    disp('Downloading fr3_office.tgz (1.38 GB). This download can take a few minutes.') 
    websave(tgzFileName, baseDownloadURL, options); 
    
    % Extract contents of the downloaded file
    disp('Extracting fr3_office.tgz (1.38 GB) ...') 
    untar(tgzFileName, dataFolder); 
end
imageFolder = [dataFolder, 'rgbd_dataset_freiburg3_long_office_household/'];

imgFolderColor = [imageFolder,'rgb/'];
imgFolderDepth = [imageFolder,'depth/'];
imdsColor      = imageDatastore(imgFolderColor);
imdsDepth      = imageDatastore(imgFolderDepth);





2. Synchronize the (RGB, depth) data for each image and display a test image

🟢 The color information and the depth information are not stored together as one synchronized image; they are saved separately. Synchronize the (color, depth) images using their timestamps.

% Load time stamp data of color images
timeColor = helperImportTimestampFile([imageFolder, 'rgb.txt']);

% Load time stamp data of depth images
timeDepth = helperImportTimestampFile([imageFolder, 'depth.txt']);

% Align the time stamp
indexPairs = helperAlignTimestamp(timeColor, timeDepth);

% Select the synchronized image data
imdsColor     = subset(imdsColor, indexPairs(:, 1));
imdsDepth     = subset(imdsDepth, indexPairs(:, 2));

% Inspect the first RGB-D image
currFrameIdx  = 1;
currIcolor    = readimage(imdsColor, currFrameIdx);
currIdepth    = readimage(imdsDepth, currFrameIdx);
imshowpair(currIcolor, currIdepth, "montage");
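helperAlignTimestamp ships with the example; conceptually it pairs each color frame with the depth frame whose timestamp is closest. A rough sketch of that idea (not the shipped helper; timeColor and timeDepth are assumed to be numeric vectors of seconds) might look like this:

% Rough sketch: pair each color timestamp with the nearest depth timestamp
function indexPairs = alignTimestampSketch(timeColor, timeDepth)
    indexPairs = zeros(numel(timeColor), 2);
    for k = 1:numel(timeColor)
        [~, j] = min(abs(timeDepth - timeColor(k)));  % nearest depth frame
        indexPairs(k, :) = [k, j];
    end
end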





3. Map initialization
🟢 This is the initial setup for running SLAM on the map. Set the parameters related to feature extraction (scaleFactor, numLevels, numPoints, etc.).

% Set random seed for reproducibility
rng(0);

% Create a cameraIntrinsics object to store the camera intrinsic parameters.
% The intrinsics for the dataset can be found at the following page:
% https://vision.in.tum.de/data/datasets/rgbd-dataset/file_formats
focalLength    = [535.4, 539.2];    % in units of pixels
principalPoint = [320.1, 247.6];    % in units of pixels
imageSize      = size(currIcolor,[1,2]); % in pixels [mrows, ncols]
depthFactor    = 5e3;
intrinsics     = cameraIntrinsics(focalLength,principalPoint,imageSize);

% Detect and extract ORB features from the color image
scaleFactor = 1.2;
numLevels   = 8;
numPoints = 1000;
[currFeatures, currPoints] = helperDetectAndExtractFeatures(currIcolor, scaleFactor, numLevels, numPoints); 

initialPose = rigidtform3d();
[xyzPoints, validIndex] = helperReconstructFromRGBD(currPoints, currIdepth, intrinsics, initialPose, depthFactor);
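helperReconstructFromRGBD is another helper shipped with the example. Its core idea is back-projection through the pinhole model: each pixel with a valid depth value is lifted to a 3-D point and transformed into the world frame. A simplified sketch (not the shipped helper), assuming the pixel coordinates come in as an M-by-2 array such as currPoints.Location and the depth image is uint16 scaled by depthFactor:

% Simplified back-projection sketch
function [xyzPoints, validIndex] = reconstructFromRGBDSketch(pixels, Idepth, intrinsics, pose, depthFactor)
    idx        = sub2ind(size(Idepth), round(pixels(:, 2)), round(pixels(:, 1)));
    depth      = double(Idepth(idx)) / depthFactor;      % depth in meters
    validIndex = find(depth > 0);                        % keep pixels with valid depth

    fx = intrinsics.FocalLength(1);    fy = intrinsics.FocalLength(2);
    cx = intrinsics.PrincipalPoint(1); cy = intrinsics.PrincipalPoint(2);

    z = depth(validIndex);
    x = (pixels(validIndex, 1) - cx) .* z / fx;
    y = (pixels(validIndex, 2) - cy) .* z / fy;

    % Transform from the camera frame to the world frame
    xyzPoints = transformPointsForward(pose, [x, y, z]);
end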





4. Build the loop closure database
🟢 SLAM is only accurate if, when the camera returns to the same place, the system does not treat it as a new place. This uses a bag-of-features (BoF) visual vocabulary (by analogy: remember and note down the distinctive features of a place, such as its trees and restaurants).
This step builds the database in which the BoF information is stored.

% Load the bag of features data created offline
bofData         = load("bagOfFeaturesDataSLAM.mat");

% Initialize the place recognition database
loopDatabase    = invertedImageIndex(bofData.bof, SaveFeatureLocations=false);

% Add features of the first key frame to the database
currKeyFrameId = 1;
addImageFeatures(loopDatabase, currFeatures, currKeyFrameId);
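bagOfFeaturesDataSLAM.mat is a vocabulary that was created offline and shipped with the example. If you wanted to build a similar vocabulary yourself, a hedged sketch could look like the following (imdsTraining and helperORBFeatureExtractorFunction are assumptions for illustration, not part of this post):

% Build a bag-of-features vocabulary offline from a set of training images
bof = bagOfFeatures(imdsTraining, ...
    CustomExtractor=@helperORBFeatureExtractorFunction, ... % assumed ORB-based extractor
    TreeProperties=[5, 10], StrongestFeatures=1);
save("bagOfFeaturesDataSLAM.mat", "bof");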





5. Build the containers for the 3D map points and RGB key frames, and test SLAM
🟢 Create the containers that store the 3D map points and the key frames.
Extract the 3D map points from the key frame and store them (the key frame is stored as well).
For each set of 3D map points, the depth range and the viewing direction of the camera are also stored.
As a test, visualize the 3D SLAM result for the first key frame.

3D map point: a point in the set of points produced as the SLAM result
Key frame: an RGB image in which many feature points are visible; these are the reference images used by SLAM

% Create an empty imageviewset object to store key frames
vSetKeyFrames = imageviewset;

% Create an empty worldpointset object to store 3-D map points
mapPointSet   = worldpointset;

% Add the first key frame
vSetKeyFrames = addView(vSetKeyFrames, currKeyFrameId, initialPose, Points=currPoints,...
    Features=currFeatures.Features);

% Add 3-D map points
[mapPointSet, rgbdMapPointsIdx] = addWorldPoints(mapPointSet, xyzPoints);

% Add observations of the map points
mapPointSet = addCorrespondences(mapPointSet, currKeyFrameId, rgbdMapPointsIdx, validIndex);

% Update view direction and depth
mapPointSet = updateLimitsAndDirection(mapPointSet, rgbdMapPointsIdx, vSetKeyFrames.Views);

% Update representative view
mapPointSet = updateRepresentativeView(mapPointSet, rgbdMapPointsIdx, vSetKeyFrames.Views);

% Visualize matched features in the first key frame
featurePlot = helperVisualizeMatchedFeaturesRGBD(currIcolor, currIdepth, currPoints(validIndex));

% Visualize initial map points and camera trajectory
xLim = [-4 4];
yLim = [-3 1];
zLim = [-1 6];
mapPlot  = helperVisualizeMotionAndStructure(vSetKeyFrames, mapPointSet, xLim, yLim, zLim);

% Show legend
showLegend(mapPlot);





6. Tracking
🟢 As the camera moves and new images are captured, the feature points are compared against those in the previous key frame to estimate the camera's relative position and orientation (pose).
Additional feature points are also identified to fine-tune the result.

Kanade-Lucas-Tomasi (vision.PointTracker): algorithm that estimates camera motion from the movement of feature points
PnP: estimates the camera position and orientation (pose) from corresponding (2D coordinate - 3D coordinate) pairs; see the sketch after this list
bundleAdjustmentMotion: refines the SLAM estimate
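As a standalone illustration of the PnP plus motion-only bundle adjustment step (variable names here are illustrative, not from the example code):

% matchedImagePoints: M-by-2 feature locations, matchedWorldPoints: M-by-3 map points
[worldPose, inlierIdx] = estworldpose(matchedImagePoints, matchedWorldPoints, intrinsics);

% Refine the pose with motion-only bundle adjustment on the inliers
refinedPose = bundleAdjustmentMotion(matchedWorldPoints(inlierIdx, :), ...
    matchedImagePoints(inlierIdx, :), worldPose, intrinsics);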

% ViewId of the last key frame
lastKeyFrameId    = currKeyFrameId;

% Index of the last key frame in the input image sequence
lastKeyFrameIdx   = currFrameIdx; 

% Indices of all the key frames in the input image sequence
addedFramesIdx    = lastKeyFrameIdx;

currFrameIdx      = 2;
isLoopClosed      = false;

% Main loop
isLastFrameKeyFrame = true;

% Create and initialize the KLT tracker
tracker = vision.PointTracker(MaxBidirectionalError = 5);
initialize(tracker, currPoints.Location(validIndex, :), currIcolor);

while currFrameIdx < numel(imdsColor.Files)

    currIcolor = readimage(imdsColor, currFrameIdx);
    currIdepth = readimage(imdsDepth, currFrameIdx);

    [currFeatures, currPoints]    = helperDetectAndExtractFeatures(currIcolor, scaleFactor, numLevels, numPoints);

    % Track the last key frame
    % trackedMapPointsIdx:  Indices of the map points observed in the current left frame 
    % trackedFeatureIdx:    Indices of the corresponding feature points in the current left frame
    [currPose, trackedMapPointsIdx, trackedFeatureIdx] = helperTrackLastKeyFrameKLT(tracker, currIcolor, mapPointSet, ...
        vSetKeyFrames.Views, currFeatures, currPoints, lastKeyFrameId, intrinsics);
    
    if isempty(currPose) || numel(trackedMapPointsIdx) < 20
        currFrameIdx = currFrameIdx + 1;
        continue
    end
    
    % Track the local map and check if the current frame is a key frame.
    % A frame is a key frame if both of the following conditions are satisfied:
    %
    % 1. At least 20 frames have passed since the last key frame or the 
    %    current frame tracks fewer than 120 map points. 
    % 2. The map points tracked by the current frame are fewer than 90% of 
    %    points tracked by the reference key frame.
    %
    % localKeyFrameIds:   ViewId of the connected key frames of the current frame
    numSkipFrames     = 20;
    numPointsKeyFrame = 120;
    [localKeyFrameIds, currPose, trackedMapPointsIdx, trackedFeatureIdx, isKeyFrame] = ...
        helperTrackLocalMap(mapPointSet, vSetKeyFrames, trackedMapPointsIdx, ...
        trackedFeatureIdx, currPose, currFeatures, currPoints, intrinsics, scaleFactor, ...
        isLastFrameKeyFrame, lastKeyFrameIdx, currFrameIdx, numSkipFrames, numPointsKeyFrame);

    % Visualize matched features
    updatePlot(featurePlot, currIcolor, currIdepth, currPoints(trackedFeatureIdx));
    
    if ~isKeyFrame
        currFrameIdx = currFrameIdx + 1;
        isLastFrameKeyFrame = false;
        continue
    else
        % Match feature points between the stereo images and get the 3-D world positions
        [xyzPoints, validIndex] = helperReconstructFromRGBD(currPoints, currIdepth, ...
            intrinsics, currPose, depthFactor);

        [untrackedFeatureIdx, ia] = setdiff(validIndex, trackedFeatureIdx);
        xyzPoints = xyzPoints(ia, :);
        isLastFrameKeyFrame = true;
    end

    % Update current key frame ID
    currKeyFrameId  = currKeyFrameId + 1;





7. Local Mapping
🟢 This explains the criterion for performing mapping (creating map points). Mapping is performed each time a new key frame is added. A new key frame is created when the difference between the feature points of the previous key frame and those of the current image becomes large.

   % Add the new key frame    
    [mapPointSet, vSetKeyFrames] = helperAddNewKeyFrame(mapPointSet, vSetKeyFrames, ...
        currPose, currFeatures, currPoints, trackedMapPointsIdx, trackedFeatureIdx, localKeyFrameIds);
        
    % Remove outlier map points that are observed in fewer than 3 key frames
    if currKeyFrameId == 2
        triangulatedMapPointsIdx = [];
    end
    
    [mapPointSet, trackedMapPointsIdx] = ...
        helperCullRecentMapPointsRGBD(mapPointSet, trackedMapPointsIdx, triangulatedMapPointsIdx, ...
        rgbdMapPointsIdx);
    
    % Add new map points computed from disparity 
    [mapPointSet, rgbdMapPointsIdx] = addWorldPoints(mapPointSet, xyzPoints);
    mapPointSet = addCorrespondences(mapPointSet, currKeyFrameId, rgbdMapPointsIdx, ...
        untrackedFeatureIdx);
    
    % Create new map points by triangulation
    minNumMatches = 10;
    minParallax   = 1;
    [mapPointSet, vSetKeyFrames, triangulatedMapPointsIdx, rgbdMapPointsIdx] = helperCreateNewMapPointsStereo( ...
        mapPointSet, vSetKeyFrames, currKeyFrameId, intrinsics, scaleFactor, minNumMatches, minParallax, ...
        untrackedFeatureIdx, rgbdMapPointsIdx);

    % Update view direction and depth
    mapPointSet = updateLimitsAndDirection(mapPointSet, [triangulatedMapPointsIdx; rgbdMapPointsIdx], ...
        vSetKeyFrames.Views);

    % Update representative view
    mapPointSet = updateRepresentativeView(mapPointSet, [triangulatedMapPointsIdx; rgbdMapPointsIdx], ...
        vSetKeyFrames.Views);

    % Local bundle adjustment
    [mapPointSet, vSetKeyFrames, triangulatedMapPointsIdx, rgbdMapPointsIdx] = ...
        helperLocalBundleAdjustmentStereo(mapPointSet, vSetKeyFrames, ...
        currKeyFrameId, intrinsics, triangulatedMapPointsIdx, rgbdMapPointsIdx);

    % Visualize 3-D world points and camera trajectory
    updatePlot(mapPlot, vSetKeyFrames, mapPointSet);

    % Set the feature points to be tracked
    [~, index2d] = findWorldPointsInView(mapPointSet, currKeyFrameId);
    setPoints(tracker, currPoints.Location(index2d, :));





8. Loop Closure
🟢 When the camera returns to a previously visited place, look it up in the database so it is not recognized as a new place, and correct the SLAM estimate.
Unmatched points between adjacent frames are also connected and used in the SLAM process.

Matched points: feature points whose position does not change much across the key frame and the frames directly before and after it (a group of three frames in total) are treated as matched points.
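Inside helperCheckLoopClosureT the database is queried for visually similar key frames. A rough standalone sketch of just that query (the shipped helper additionally checks connectivity and match counts) is:

% Query the inverted image index for the key frames most similar to the current image
[candidateIds, similarityScores] = retrieveImages(currIcolor, loopDatabase, NumResults=3);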

   % Check loop closure after some key frames have been created    
    if currKeyFrameId > 20
        
        % Minimum number of feature matches of loop edges
        loopEdgeNumMatches = 120;
        
        % Detect possible loop closure key frame candidates
        [isDetected, validLoopCandidates] = helperCheckLoopClosureT(vSetKeyFrames, currKeyFrameId, ...
            loopDatabase, currIcolor, loopEdgeNumMatches);
        
        if isDetected 
            % Add loop closure connections
            maxDistance = 0.1;
            [isLoopClosed, mapPointSet, vSetKeyFrames] = helperAddLoopConnectionsStereo(...
                mapPointSet, vSetKeyFrames, validLoopCandidates, currKeyFrameId, ...
                currFeatures, currPoints, loopEdgeNumMatches, maxDistance);
        end
    end
    
    % If no loop closure is detected, add current features into the database
    if ~isLoopClosed
        addImageFeatures(loopDatabase,  currFeatures, currKeyFrameId);
    end
    
    % Update IDs and indices
    lastKeyFrameId  = currKeyFrameId;
    lastKeyFrameIdx = currFrameIdx;
    addedFramesIdx  = [addedFramesIdx; currFrameIdx]; %#ok<AGROW>
    currFrameIdx    = currFrameIdx + 1;
end % End of main loop





9. Pose Graph Optimization
🟢 Frames with fewer matches than the minimum number of matched points (minNumMatches) are removed, leaving the essential graph (a noise-removal step).
The SLAM result is then fine-tuned based on this essential graph.
The positions of the optimized 3D map points are updated. (In the actual SLAM process, many feature points are detected at first; the pose graph optimization step leaves only the key ones.)

% Optimize the poses
minNumMatches      = 50;
vSetKeyFramesOptim = optimizePoses(vSetKeyFrames, minNumMatches, Tolerance=1e-16);

% Update map points after optimizing the poses
mapPointSet = helperUpdateGlobalMap(mapPointSet, vSetKeyFrames, vSetKeyFramesOptim);

updatePlot(mapPlot, vSetKeyFrames, mapPointSet);

% Plot the optimized camera trajectory
optimizedPoses  = poses(vSetKeyFramesOptim);
plotOptimizedTrajectory(mapPlot, optimizedPoses)

% Update legend
showLegend(mapPlot);





10. SLAM results






11. Compare the SLAM result with the ground truth
🟢 Evaluate the accuracy of the algorithm by comparing the ground-truth camera position and orientation for each frame against the camera position and orientation estimated by SLAM.

% Load ground truth
gTruthData = load("orbslamGroundTruth.mat");
gTruth     = gTruthData.gTruth;

% Plot the actual camera trajectory
plotActualTrajectory(mapPlot, gTruth(indexPairs(addedFramesIdx, 1)), optimizedPoses);

% Show legend
showLegend(mapPlot);
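A rough way to quantify the error is the RMSE between the estimated and ground-truth key frame positions. A sketch under the assumption that gTruth is an array of rigidtform3d poses aligned to the same key frames:

% Compare estimated and ground-truth key frame positions
estPoses = optimizedPoses.AbsolutePose;               % estimated key frame poses
estTrans = vertcat(estPoses.Translation);             % N-by-3 positions
gtPoses  = gTruth(indexPairs(addedFramesIdx, 1));     % corresponding ground-truth poses
gtTrans  = vertcat(gtPoses.Translation);
rmse     = sqrt(mean(sum((estTrans - gtTrans).^2, 2)));
disp(['Absolute key frame trajectory RMSE (m): ', num2str(rmse)])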





12. Dense Reconstruction
🟢 The process of creating a dense 3D model from the images captured by the camera or sensor. The focus is on reconstructing the fine 3D structure of the environment, including all the small surface details. This is the step that rebuilds the space, based on the information produced by SLAM, into a form we can actually look at.

% Create an array of pointCloud objects to store the world points constructed
% from the key frames
ptClouds =  repmat(pointCloud(zeros(1, 3)), numel(addedFramesIdx), 1);

% Ignore image points at the boundary 
offset = 40;
[X, Y] = meshgrid(offset:2:imageSize(2)-offset, offset:2:imageSize(1)-offset);

for i = 1: numel(addedFramesIdx)
    Icolor = readimage(imdsColor, addedFramesIdx(i));
    Idepth = readimage(imdsDepth, addedFramesIdx(i));

    [xyzPoints, validIndex] = helperReconstructFromRGBD([X(:), Y(:)], ...
        Idepth, intrinsics, optimizedPoses.AbsolutePose(i), depthFactor);

    colors = zeros(numel(X), 1, 'like', Icolor);
    for j = 1:numel(X)
        colors(j, 1:3) = Icolor(Y(j), X(j), :);
    end
    ptClouds(i) = pointCloud(xyzPoints, Color=colors(validIndex, :));
end

% Concatenate the point clouds
pointCloudsAll = pccat(ptClouds);

figure
pcshow(pointCloudsAll,VerticalAxis="y", VerticalAxisDir="down");
xlabel('X')
ylabel('Y')
zlabel('Z')
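The merged cloud can contain millions of points. If it is too heavy to display, it can be thinned with a grid filter before calling pcshow (the 0.01 m voxel size is an arbitrary choice):

% Optional: average points within 1 cm voxels to reduce the cloud size
pointCloudsAll = pcdownsample(pointCloudsAll, 'gridAverage', 0.01);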





💡 Wrap-up

I have learned how 3D SLAM works with an RGB-D (depth) camera.

While looking into it, I also learned that the tracking step (camera motion and pose estimation) becomes much more accurate when an IMU is fused in, rather than relying only on the motion of feature points between frames.
(❗❗ Note that this post does not use an IMU ❗❗)

Next, I want to try SLAM with sensor fusion of a 3D camera and an IMU in MATLAB.

I also realized that if we swap in our own dataset, we can run SLAM on any place we want.
