I recently came across this project on GitHub. According to its description it is derived from LOAM and A-LOAM, with roughly a 3x reduction in computation time and some improvement in accuracy. The changes are incremental and there is no dedicated paper for FLOAM itself. The author also added some place recognition improvements, published at ICRA 2020 as ISCLOAM, with the code open-sourced on GitHub as well; I will write that up separately later.
I ran it on KITTI:
Red is the ground truth; green is the odometry trajectory.
Project repository: https://github.com/wh200720041/floam
My annotated notes: https://github.com/JokerJohn/opensource_slam_noted
Overview

FLOAM is a modified version of LOAM / A-LOAM; the underlying principle and pipeline are unchanged. The pipeline is: point cloud preprocessing, extraction of edge and planar features, matching of the two feature types to estimate the pose, and finally pose integration. Mapping and localization modules are also provided, but the mapping part only stitches point clouds together using the estimated poses, without any back-end optimization, so it is not the key part. The rest of this post walks through the code.
Code structure

Judging from the code layout, the main cpp files are:
lidar.cpp: system parameter configuration, e.g. the number of scan lines, the scan period, etc.
laserProcessingClass.cpp, laserProcessingNode.cpp: point cloud preprocessing node, including edge and planar feature extraction.
odomEstimationNode.cpp, odomEstimationClass.cpp: pose estimation node; builds point-to-line and point-to-plane residuals from the edge and planar features and estimates the pose with Ceres over a sliding local map.
laserMappingNode.cpp, laserMappingClass.cpp: mapping node; stitches point clouds together using the odometry.
lidarOptimization.cpp: the Ceres analytic cost functions (point-to-line and point-to-plane) and the SE3 pose parameterization used by the odometry estimation.
Feature extraction

This node subscribes to the raw point cloud, performs an initial filtering pass, and extracts edge and planar features. The callback only pushes the incoming messages into a queue; all of the actual processing runs in a separate thread.
```cpp
ros::Subscriber subLaserCloud = nh.subscribe<sensor_msgs::PointCloud2>("/velodyne_points", 100, velodyneHandler);

pubLaserCloudFiltered = nh.advertise<sensor_msgs::PointCloud2>("/velodyne_points_filtered", 100);
pubEdgePoints = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_edge", 100);
pubSurfPoints = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_surf", 100);

std::thread laser_processing_process{laser_processing};
```
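For reference, the subscriber callback does nothing beyond buffering the message. A minimal sketch of this queue-plus-worker-thread pattern (variable names follow the snippet above; the exact body in the repo may differ slightly):

```cpp
#include <queue>
#include <mutex>
#include <sensor_msgs/PointCloud2.h>

// buffers shared between the ROS callback and the processing thread
std::queue<sensor_msgs::PointCloud2ConstPtr> pointCloudBuf;
std::mutex mutex_lock;

// the callback only enqueues the message; all real work happens in laser_processing()
void velodyneHandler(const sensor_msgs::PointCloud2ConstPtr &laserCloudMsg) {
    mutex_lock.lock();
    pointCloudBuf.push(laserCloudMsg);
    mutex_lock.unlock();
}
```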
The laser_processing_process thread loops over: take a point cloud from the queue -> extract edge and planar features -> publish the resulting clouds. Fetching a point cloud first:
```cpp
mutex_lock.lock();
pcl::PointCloud<pcl::PointXYZI>::Ptr pointcloud_in(new pcl::PointCloud<pcl::PointXYZI>());
pcl::fromROSMsg(*pointCloudBuf.front(), *pointcloud_in);
ros::Time pointcloud_time = (pointCloudBuf.front())->header.stamp;
pointCloudBuf.pop();
mutex_lock.unlock();
```
For feature extraction, each point is first assigned to its scan line (ring) and its curvature is computed; each scan is then divided into 6 sectors and features are extracted per sector, so that the features are distributed evenly in all directions.
```cpp
int N_SCANS = lidar_param.num_lines;
std::vector<pcl::PointCloud<pcl::PointXYZI>::Ptr> laserCloudScans;
for (int i = 0; i < N_SCANS; i++) {
    laserCloudScans.push_back(pcl::PointCloud<pcl::PointXYZI>::Ptr(new pcl::PointCloud<pcl::PointXYZI>()));
}

for (int i = 0; i < (int) pc_in->points.size(); i++) {
    int scanID = 0;
    double distance = sqrt(pc_in->points[i].x * pc_in->points[i].x + pc_in->points[i].y * pc_in->points[i].y);
    if (distance < lidar_param.min_distance || distance > lidar_param.max_distance)
        continue;
    double angle = atan(pc_in->points[i].z / distance) * 180 / M_PI;

    if (N_SCANS == 16) {
        scanID = int((angle + 15) / 2 + 0.5);
        if (scanID > (N_SCANS - 1) || scanID < 0) {
            continue;
        }
    } else if (N_SCANS == 32) {
        scanID = int((angle + 92.0 / 3.0) * 3.0 / 4.0);
        if (scanID > (N_SCANS - 1) || scanID < 0) {
            continue;
        }
    } else if (N_SCANS == 64) {
        if (angle >= -8.83)
            scanID = int((2 - angle) * 3.0 + 0.5);
        else
            scanID = N_SCANS / 2 + int((-8.83 - angle) * 2.0 + 0.5);
        if (angle > 2 || angle < -24.33 || scanID > 63 || scanID < 0) {
            continue;
        }
    } else {
        printf("wrong scan number\n");
    }
    laserCloudScans[scanID]->push_back(pc_in->points[i]);
}
```
Compute the curvature of each point, as in LOAM.
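Written out, the quantity computed for a point $p_j$ in the loop below is the (unnormalized) LOAM curvature over its five neighbours on each side:

$$c_j = \Big\| \sum_{k=-5,\,k\neq 0}^{5} \big(p_{j+k} - p_j\big) \Big\|^2$$

which is exactly `diffX*diffX + diffY*diffY + diffZ*diffZ`, since the sum of the ten neighbours minus $10\,p_j$ equals the sum of the differences.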
```cpp
for (int i = 0; i < N_SCANS; i++) {
    if (laserCloudScans[i]->points.size() < 131) {
        continue;
    }
    std::vector<Double2d> cloudCurvature;
    int total_points = laserCloudScans[i]->points.size() - 10;
    for (int j = 5; j < (int) laserCloudScans[i]->points.size() - 5; j++) {
        double diffX = laserCloudScans[i]->points[j - 5].x + laserCloudScans[i]->points[j - 4].x +
                       laserCloudScans[i]->points[j - 3].x + laserCloudScans[i]->points[j - 2].x +
                       laserCloudScans[i]->points[j - 1].x - 10 * laserCloudScans[i]->points[j].x +
                       laserCloudScans[i]->points[j + 1].x + laserCloudScans[i]->points[j + 2].x +
                       laserCloudScans[i]->points[j + 3].x + laserCloudScans[i]->points[j + 4].x +
                       laserCloudScans[i]->points[j + 5].x;
        double diffY = laserCloudScans[i]->points[j - 5].y + laserCloudScans[i]->points[j - 4].y +
                       laserCloudScans[i]->points[j - 3].y + laserCloudScans[i]->points[j - 2].y +
                       laserCloudScans[i]->points[j - 1].y - 10 * laserCloudScans[i]->points[j].y +
                       laserCloudScans[i]->points[j + 1].y + laserCloudScans[i]->points[j + 2].y +
                       laserCloudScans[i]->points[j + 3].y + laserCloudScans[i]->points[j + 4].y +
                       laserCloudScans[i]->points[j + 5].y;
        double diffZ = laserCloudScans[i]->points[j - 5].z + laserCloudScans[i]->points[j - 4].z +
                       laserCloudScans[i]->points[j - 3].z + laserCloudScans[i]->points[j - 2].z +
                       laserCloudScans[i]->points[j - 1].z - 10 * laserCloudScans[i]->points[j].z +
                       laserCloudScans[i]->points[j + 1].z + laserCloudScans[i]->points[j + 2].z +
                       laserCloudScans[i]->points[j + 3].z + laserCloudScans[i]->points[j + 4].z +
                       laserCloudScans[i]->points[j + 5].z;
        Double2d distance(j, diffX * diffX + diffY * diffY + diffZ * diffZ);
        cloudCurvature.push_back(distance);
    }
```
Split the points evenly into sectors:
```cpp
    for (int j = 0; j < 6; j++) {
        int sector_length = (int) (total_points / 6);
        int sector_start = sector_length * j;
        int sector_end = sector_length * (j + 1) - 1;
        if (j == 5) {
            sector_end = total_points - 1;
        }
        std::vector<Double2d> subCloudCurvature(cloudCurvature.begin() + sector_start,
                                                cloudCurvature.begin() + sector_end);
        featureExtractionFromSector(laserCloudScans[i], subCloudCurvature, pc_out_edge, pc_out_surf);
    }
```
Feature selection:
```cpp
// sort the points of this sector by curvature (ascending)
std::sort(cloudCurvature.begin(), cloudCurvature.end(),
          [](const Double2d &a, const Double2d &b) { return a.value < b.value; });

int largestPickedNum = 0;
std::vector<int> picked_points;
int point_info_count = 0;
// walk from the largest curvature downwards and pick edge points
for (int i = cloudCurvature.size() - 1; i >= 0; i--) {
    int ind = cloudCurvature[i].id;
    if (std::find(picked_points.begin(), picked_points.end(), ind) == picked_points.end()) {
        if (cloudCurvature[i].value <= 0.1) {
            // curvature too small: everything below is flat, stop
            break;
        }

        largestPickedNum++;
        picked_points.push_back(ind);

        // keep at most the 20 points with the largest curvature
        if (largestPickedNum <= 20) {
            pc_out_edge->push_back(pc_in->points[ind]);   // treated as an edge point
            point_info_count++;
        } else {
            break;
        }

        // mark the 5 neighbours on each side as picked (unless there is a large gap),
        // so that selected edge points do not cluster together
        for (int k = 1; k <= 5; k++) {
            double diffX = pc_in->points[ind + k].x - pc_in->points[ind + k - 1].x;
            double diffY = pc_in->points[ind + k].y - pc_in->points[ind + k - 1].y;
            double diffZ = pc_in->points[ind + k].z - pc_in->points[ind + k - 1].z;
            if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05) {
                break;
            }
            picked_points.push_back(ind + k);
        }
        for (int k = -1; k >= -5; k--) {
            double diffX = pc_in->points[ind + k].x - pc_in->points[ind + k + 1].x;
            double diffY = pc_in->points[ind + k].y - pc_in->points[ind + k + 1].y;
            double diffZ = pc_in->points[ind + k].z - pc_in->points[ind + k + 1].z;
            if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05) {
                break;
            }
            picked_points.push_back(ind + k);
        }
    }
}

// all remaining (un-picked) points are treated as surface points
for (int i = 0; i <= (int) cloudCurvature.size() - 1; i++) {
    int ind = cloudCurvature[i].id;
    if (std::find(picked_points.begin(), picked_points.end(), ind) == picked_points.end()) {
        pc_out_surf->push_back(pc_in->points[ind]);
    }
}
```
Pose estimation

The pose estimation node receives the edge and planar features, builds point-to-line and point-to-plane residuals from them, and solves for the rotation quaternion q and translation t with Ceres. Honestly, I could not see much difference from LOAM / A-LOAM here.
First the time stamps of the buffers are aligned and point clouds that are too old are dropped.
```cpp
mutex_lock.lock();
if (!pointCloudBuf.empty() &&
    (pointCloudBuf.front()->header.stamp.toSec() < pointCloudSurfBuf.front()->header.stamp.toSec() - 0.5 * lidar_param.scan_period ||
     pointCloudBuf.front()->header.stamp.toSec() < pointCloudEdgeBuf.front()->header.stamp.toSec() - 0.5 * lidar_param.scan_period)) {
    ROS_WARN("time stamp unaligned error and odom discarded, pls check your data --> odom correction");
    pointCloudBuf.pop();
    mutex_lock.unlock();
    continue;
}

if (!pointCloudSurfBuf.empty() &&
    (pointCloudSurfBuf.front()->header.stamp.toSec() < pointCloudBuf.front()->header.stamp.toSec() - 0.5 * lidar_param.scan_period ||
     pointCloudSurfBuf.front()->header.stamp.toSec() < pointCloudEdgeBuf.front()->header.stamp.toSec() - 0.5 * lidar_param.scan_period)) {
    pointCloudSurfBuf.pop();
    ROS_INFO("time stamp unaligned with extra point cloud, pls check your data --> odom correction");
    mutex_lock.unlock();
    continue;
}

if (!pointCloudEdgeBuf.empty() &&
    (pointCloudEdgeBuf.front()->header.stamp.toSec() < pointCloudBuf.front()->header.stamp.toSec() - 0.5 * lidar_param.scan_period ||
     pointCloudEdgeBuf.front()->header.stamp.toSec() < pointCloudSurfBuf.front()->header.stamp.toSec() - 0.5 * lidar_param.scan_period)) {
    pointCloudEdgeBuf.pop();
    ROS_INFO("time stamp unaligned with extra point cloud, pls check your data --> odom correction");
    mutex_lock.unlock();
    continue;
}
```
A sliding window is then used to maintain a local map of corner and surface features. The first frame is inserted directly into the local map for initialization; every subsequent frame is matched against it to build constraints. The main flow:
```cpp
if (is_odom_inited == false) {
    odomEstimation.initMapWithPoints(pointcloud_edge_in, pointcloud_surf_in);
    is_odom_inited = true;
    ROS_INFO("odom inited");
} else {
    std::chrono::time_point<std::chrono::system_clock> start, end;
    start = std::chrono::system_clock::now();
    odomEstimation.updatePointsToMap(pointcloud_edge_in, pointcloud_surf_in);
    end = std::chrono::system_clock::now();
    std::chrono::duration<float> elapsed_seconds = end - start;
    total_frame++;
    float time_temp = elapsed_seconds.count() * 1000;
    total_time += time_temp;
    ROS_INFO("average odom estimation time %f ms \n \n", total_time / total_frame);
}
```
Building the point-to-line constraints: for each edge point, find the 5 nearest map points and run a principal component analysis on their covariance matrix. If the 5 points lie on a line, one eigenvalue of the covariance matrix is significantly larger than the other two, and the corresponding eigenvector gives the direction of that line; if the 5 points lie on a plane, one eigenvalue is significantly smaller than the others, and the corresponding eigenvector gives the plane normal. Reference: "Fast and Robust 3D Feature Extraction from Sparse Point Clouds", IROS 2016.
```cpp
int corner_num = 0;
for (int i = 0; i < (int) pc_in->points.size(); i++) {
    pcl::PointXYZI point_temp;
    pointAssociateToMap(&(pc_in->points[i]), &point_temp);

    std::vector<int> pointSearchInd;
    std::vector<float> pointSearchSqDis;
    kdtreeEdgeMap->nearestKSearch(point_temp, 5, pointSearchInd, pointSearchSqDis);
    if (pointSearchSqDis[4] < 1.0) {
        std::vector<Eigen::Vector3d> nearCorners;
        Eigen::Vector3d center(0, 0, 0);
        for (int j = 0; j < 5; j++) {
            Eigen::Vector3d tmp(map_in->points[pointSearchInd[j]].x,
                                map_in->points[pointSearchInd[j]].y,
                                map_in->points[pointSearchInd[j]].z);
            center = center + tmp;
            nearCorners.push_back(tmp);
        }
        center = center / 5.0;

        Eigen::Matrix3d covMat = Eigen::Matrix3d::Zero();
        for (int j = 0; j < 5; j++) {
            Eigen::Matrix<double, 3, 1> tmpZeroMean = nearCorners[j] - center;
            covMat = covMat + tmpZeroMean * tmpZeroMean.transpose();
        }

        Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> saes(covMat);
        Eigen::Vector3d unit_direction = saes.eigenvectors().col(2);
        Eigen::Vector3d curr_point(pc_in->points[i].x, pc_in->points[i].y, pc_in->points[i].z);
        if (saes.eigenvalues()[2] > 3 * saes.eigenvalues()[1]) {
            Eigen::Vector3d point_on_line = center;
            Eigen::Vector3d point_a, point_b;
            point_a = 0.1 * unit_direction + point_on_line;
            point_b = -0.1 * unit_direction + point_on_line;

            ceres::CostFunction *cost_function = new EdgeAnalyticCostFunction(curr_point, point_a, point_b);
            problem.AddResidualBlock(cost_function, loss_function, parameters);
            corner_num++;
        }
    }
}
if (corner_num < 20) {
    printf("not enough correct points");
}
```
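To summarize the geometry above: with the five neighbours $q_1,\dots,q_5$ and their centroid $\bar q$, the code builds the covariance

$$C = \sum_{j=1}^{5} (q_j - \bar q)(q_j - \bar q)^{T}$$

and accepts the point as lying on an edge when the largest eigenvalue dominates ($\lambda_3 > 3\lambda_2$). Two points $a = \bar q + 0.1u$ and $b = \bar q - 0.1u$ are placed on the line along the dominant eigenvector $u$, and the residual for the transformed point $\tilde p = q\,p + t$ is the usual LOAM point-to-line distance

$$r = \frac{\lVert (\tilde p - a) \times (\tilde p - b) \rVert}{\lVert a - b \rVert}.$$

(The residual itself lives in EdgeAnalyticCostFunction in lidarOptimization.cpp, which is not shown in this post; the formula above is the standard point-to-line distance that I take it to implement.)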
The point-to-plane constraints: find the 5 nearest points in the local map with the kd-tree, obtain the plane parameters by solving a least-squares problem via QR decomposition, and then build the point-to-plane residual.
```cpp
int surf_num = 0;
for (int i = 0; i < (int) pc_in->points.size(); i++) {
    pcl::PointXYZI point_temp;
    pointAssociateToMap(&(pc_in->points[i]), &point_temp);

    std::vector<int> pointSearchInd;
    std::vector<float> pointSearchSqDis;
    kdtreeSurfMap->nearestKSearch(point_temp, 5, pointSearchInd, pointSearchSqDis);

    Eigen::Matrix<double, 5, 3> matA0;
    Eigen::Matrix<double, 5, 1> matB0 = -1 * Eigen::Matrix<double, 5, 1>::Ones();
    if (pointSearchSqDis[4] < 1.0) {
        for (int j = 0; j < 5; j++) {
            matA0(j, 0) = map_in->points[pointSearchInd[j]].x;
            matA0(j, 1) = map_in->points[pointSearchInd[j]].y;
            matA0(j, 2) = map_in->points[pointSearchInd[j]].z;
        }
        Eigen::Vector3d norm = matA0.colPivHouseholderQr().solve(matB0);
        double negative_OA_dot_norm = 1 / norm.norm();
        norm.normalize();

        bool planeValid = true;
        for (int j = 0; j < 5; j++) {
            if (fabs(norm(0) * map_in->points[pointSearchInd[j]].x +
                     norm(1) * map_in->points[pointSearchInd[j]].y +
                     norm(2) * map_in->points[pointSearchInd[j]].z + negative_OA_dot_norm) > 0.2) {
                planeValid = false;
                break;
            }
        }
        Eigen::Vector3d curr_point(pc_in->points[i].x, pc_in->points[i].y, pc_in->points[i].z);
        if (planeValid) {
            ceres::CostFunction *cost_function = new SurfNormAnalyticCostFunction(curr_point, norm, negative_OA_dot_norm);
            problem.AddResidualBlock(cost_function, loss_function, parameters);
            surf_num++;
        }
    }
}
if (surf_num < 20) {
    printf("not enough correct points");
}
```
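The least-squares step above fits a plane $n'^{T} x = -1$ through the five neighbours by solving

$$\begin{bmatrix} q_1^{T} \\ \vdots \\ q_5^{T} \end{bmatrix} n' = -\mathbf{1}_{5\times 1},$$

then normalizes $n = n' / \lVert n' \rVert$ and sets $d = 1/\lVert n' \rVert$ (`negative_OA_dot_norm`), so the plane equation is $n^{T}x + d = 0$. The fit is rejected if any of the five neighbours lies farther than 0.2 m from this plane, and the residual for the transformed query point $\tilde p = q\,p + t$ is the signed point-to-plane distance

$$r = n^{T}\tilde p + d.$$

(As with the edge case, the actual residual is implemented in SurfNormAnalyticCostFunction; this is the standard form it corresponds to.)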
Finally, solve the optimization problem and update the odometry.
```cpp
if (laserCloudCornerMap->points.size() > 10 && laserCloudSurfMap->points.size() > 50) {
    kdtreeEdgeMap->setInputCloud(laserCloudCornerMap);
    kdtreeSurfMap->setInputCloud(laserCloudSurfMap);

    for (int iterCount = 0; iterCount < optimization_count; iterCount++) {
        ceres::LossFunction *loss_function = new ceres::HuberLoss(0.1);
        ceres::Problem::Options problem_options;
        ceres::Problem problem(problem_options);

        problem.AddParameterBlock(parameters, 7, new PoseSE3Parameterization());

        addEdgeCostFactor(downsampledEdgeCloud, laserCloudCornerMap, problem, loss_function);
        addSurfCostFactor(downsampledSurfCloud, laserCloudSurfMap, problem, loss_function);

        ceres::Solver::Options options;
        options.linear_solver_type = ceres::DENSE_QR;
        options.max_num_iterations = 4;
        options.minimizer_progress_to_stdout = false;
        options.check_gradients = false;
        options.gradient_check_relative_precision = 1e-4;
        ceres::Solver::Summary summary;

        ceres::Solve(options, &problem, &summary);
    }
} else {
    printf("not enough points in map to associate, map error");
}
odom = Eigen::Isometry3d::Identity();
odom.linear() = q_w_curr.toRotationMatrix();
odom.translation() = t_w_curr;
addPointsToMap(downsampledEdgeCloud, downsampledSurfCloud);
```
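The `parameters` block above is 7 doubles (quaternion plus translation) handled by PoseSE3Parameterization from lidarOptimization.cpp, which I have not pasted here. As a rough sketch of what such an SE3 local parameterization looks like under the (older) ceres::LocalParameterization API — this is my paraphrase, not the repo's exact code:

```cpp
#include <ceres/ceres.h>
#include <Eigen/Dense>

// Sketch only: 7-dim global parameter [qx, qy, qz, qw, tx, ty, tz],
// 6-dim local increment [rotation (so3), translation], applied by
// left-multiplying the quaternion with the exponential of the increment.
class PoseSE3ParameterizationSketch : public ceres::LocalParameterization {
public:
    bool Plus(const double *x, const double *delta, double *x_plus_delta) const override {
        Eigen::Map<const Eigen::Quaterniond> q(x);
        Eigen::Map<const Eigen::Vector3d> t(x + 4);
        Eigen::Vector3d omega(delta[0], delta[1], delta[2]);
        double theta = omega.norm();
        Eigen::Quaterniond dq;
        if (theta < 1e-10) {
            // small-angle approximation of the exponential map
            dq = Eigen::Quaterniond(1.0, 0.5 * omega.x(), 0.5 * omega.y(), 0.5 * omega.z());
        } else {
            dq = Eigen::Quaterniond(Eigen::AngleAxisd(theta, omega / theta));
        }
        Eigen::Map<Eigen::Quaterniond> q_out(x_plus_delta);
        Eigen::Map<Eigen::Vector3d> t_out(x_plus_delta + 4);
        q_out = (dq * q).normalized();
        t_out = t + Eigen::Vector3d(delta[3], delta[4], delta[5]);
        return true;
    }
    bool ComputeJacobian(const double * /*x*/, double *jacobian) const override {
        // identity "lift": the analytic cost functions already provide
        // Jacobians with respect to the 6-dim local increment
        Eigen::Map<Eigen::Matrix<double, 7, 6, Eigen::RowMajor>> J(jacobian);
        J.setZero();
        J.topRows<6>().setIdentity();
        return true;
    }
    int GlobalSize() const override { return 7; }
    int LocalSize() const override { return 6; }
};
```

If I had to guess where the claimed speed-up over A-LOAM comes from, the analytic Jacobians in these cost functions (instead of autodiff factors) are presumably part of it.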
Summary

The remaining mapping node and the evaluation part do not need much introduction, so I skip them. Having gone through the whole pipeline, it feels essentially the same as A-LOAM, and I am left wondering where the improvements actually show up.