Open-Source SLAM Systems: A Walkthrough of the FLOAM Source Code

I recently came across this system on GitHub. According to its README it is an improved version of LOAM and A-LOAM, cutting computation time to roughly a third while also improving accuracy somewhat. FLOAM itself is only an incremental improvement and has no dedicated paper. The author also added place-recognition extensions, published at ICRA 2020 as ISCLOAM, with the code open-sourced on GitHub as well; I will cover that in a separate post.


I ran it on KITTI:

Red is the ground truth, green is the odometry trajectory.

[Figure: odometry trajectory vs. ground truth on KITTI]

Project repository: https://github.com/wh200720041/floam

My annotated notes: https://github.com/JokerJohn/opensource_slam_noted

Overview

FLOAM is a modified version of LOAM and A-LOAM; the underlying principle and pipeline are unchanged. The point cloud is first preprocessed, then edge and planar features are extracted and matched to estimate the pose, and finally the poses are fused. Mapping and localization modules are also provided, but the mapping module merely stitches the point clouds together using the estimated poses, with no back-end optimization, so it is not the key part. The rest of this post goes through the code.

[Figure: FLOAM system pipeline]

Code Structure

Looking at the code layout, the main .cpp files are:

  • lidar.cpp: system configuration, e.g. the number of scan lines and the scan period
  • laserProcessingClass.cpp, laserProcessingNode.cpp: point cloud preprocessing node, including edge and planar feature extraction
  • odomEstimationNode.cpp, odomEstimationClass.cpp: pose estimation node; builds point-to-line and point-to-plane residuals from the edge and planar features and solves them jointly with Ceres against a local feature map to estimate the pose
  • laserMappingNode.cpp, laserMappingClass.cpp: mapping node; stitches point clouds together using the odometry
  • lidarOptimization.cpp: the analytic Ceres cost functions (point-to-line, point-to-plane) and the SE(3) pose parameterization used by the odometry

Feature Extraction

This node subscribes to the raw point cloud, performs an initial filtering pass, and extracts edge and planar features. The callback only pushes the incoming messages into a queue; all of the actual work is done in a separate thread.

// the callback contains no logic; it only buffers the incoming data
ros::Subscriber subLaserCloud = nh.subscribe<sensor_msgs::PointCloud2>("/velodyne_points", 100, velodyneHandler);

pubLaserCloudFiltered = nh.advertise<sensor_msgs::PointCloud2>("/velodyne_points_filtered", 100);
pubEdgePoints = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_edge", 100);
pubSurfPoints = nh.advertise<sensor_msgs::PointCloud2>("/laser_cloud_surf", 100);

// the point clouds are processed in a separate laser_processing thread
std::thread laser_processing_process{laser_processing};

The laser_processing thread runs in a loop: fetch a point cloud from the buffer -> extract edge and planar features -> publish the resulting clouds. First, fetching the data:

// read data: take the front element of the queue
mutex_lock.lock();
pcl::PointCloud<pcl::PointXYZI>::Ptr pointcloud_in(new pcl::PointCloud<pcl::PointXYZI>());
pcl::fromROSMsg(*pointCloudBuf.front(), *pointcloud_in);
ros::Time pointcloud_time = (pointCloudBuf.front())->header.stamp;
pointCloudBuf.pop(); // remove it from the buffer
mutex_lock.unlock();

Feature extraction works as follows: each point is first assigned to a scan ring, the curvature is computed, and every ring is split into 6 sectors so that features are extracted evenly in all directions.

int N_SCANS = lidar_param.num_lines;
// one container per scan ring; points are sorted into these by ring index
std::vector<pcl::PointCloud<pcl::PointXYZI>::Ptr> laserCloudScans;
for (int i = 0; i < N_SCANS; i++) {
    laserCloudScans.push_back(pcl::PointCloud<pcl::PointXYZI>::Ptr(new pcl::PointCloud<pcl::PointXYZI>()));
}
// iterate over the cloud and decide which ring each point belongs to
for (int i = 0; i < (int) pc_in->points.size(); i++) {
    int scanID = 0;
    // drop points that are too close or too far away
    double distance = sqrt(pc_in->points[i].x * pc_in->points[i].x + pc_in->points[i].y * pc_in->points[i].y);
    if (distance < lidar_param.min_distance || distance > lidar_param.max_distance)
        continue;
    // vertical angle of the point, used to determine its ring
    double angle = atan(pc_in->points[i].z / distance) * 180 / M_PI;

    // map the vertical angle to a ring index, depending on the lidar model
    if (N_SCANS == 16) { // 16-beam: 16 rings from -15 to +15 degrees, 2 degrees apart
        scanID = int((angle + 15) / 2 + 0.5); // shift by +15 degrees so the index is non-negative
        if (scanID > (N_SCANS - 1) || scanID < 0) {
            continue;
        }
    } else if (N_SCANS == 32) {
        scanID = int((angle + 92.0 / 3.0) * 3.0 / 4.0);
        if (scanID > (N_SCANS - 1) || scanID < 0) {
            continue;
        }
    } else if (N_SCANS == 64) {
        if (angle >= -8.83)
            scanID = int((2 - angle) * 3.0 + 0.5);
        else
            scanID = N_SCANS / 2 + int((-8.83 - angle) * 2.0 + 0.5);

        if (angle > 2 || angle < -24.33 || scanID > 63 || scanID < 0) {
            continue;
        }
    } else {
        printf("wrong scan number\n");
    }
    laserCloudScans[scanID]->push_back(pc_in->points[i]); // put the point into its ring container
}

Next, compute the curvature of each point, following LOAM.
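
Concretely, for a point $p_j$ the loop below computes the unnormalized LOAM smoothness value (this is my summary of the code, not a formula quoted from the paper):

$$c_j = \left\| \sum_{k=1}^{5} \big[ (p_{j-k} - p_j) + (p_{j+k} - p_j) \big] \right\|^2$$

i.e. the squared norm of the sum of difference vectors to the five neighbors on each side along the same ring. A large $c_j$ indicates a sharp, edge-like point; a small $c_j$ a flat, planar point.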

// iterate ring by ring
for (int i = 0; i < N_SCANS; i++) {
    if (laserCloudScans[i]->points.size() < 131) { // skip rings with too few points
        continue;
    }
    // curvature (roughness) of every point on this ring, as in LOAM
    std::vector<Double2d> cloudCurvature;
    int total_points = laserCloudScans[i]->points.size() - 10;
    // for each point, sum the coordinate differences to its 5 neighbors on each side and take the squared norm
    for (int j = 5; j < (int) laserCloudScans[i]->points.size() - 5; j++) {
        double diffX = laserCloudScans[i]->points[j - 5].x + laserCloudScans[i]->points[j - 4].x
                       + laserCloudScans[i]->points[j - 3].x + laserCloudScans[i]->points[j - 2].x
                       + laserCloudScans[i]->points[j - 1].x - 10 * laserCloudScans[i]->points[j].x
                       + laserCloudScans[i]->points[j + 1].x + laserCloudScans[i]->points[j + 2].x
                       + laserCloudScans[i]->points[j + 3].x + laserCloudScans[i]->points[j + 4].x
                       + laserCloudScans[i]->points[j + 5].x;
        double diffY = laserCloudScans[i]->points[j - 5].y + laserCloudScans[i]->points[j - 4].y
                       + laserCloudScans[i]->points[j - 3].y + laserCloudScans[i]->points[j - 2].y
                       + laserCloudScans[i]->points[j - 1].y - 10 * laserCloudScans[i]->points[j].y
                       + laserCloudScans[i]->points[j + 1].y + laserCloudScans[i]->points[j + 2].y
                       + laserCloudScans[i]->points[j + 3].y + laserCloudScans[i]->points[j + 4].y
                       + laserCloudScans[i]->points[j + 5].y;
        double diffZ = laserCloudScans[i]->points[j - 5].z + laserCloudScans[i]->points[j - 4].z
                       + laserCloudScans[i]->points[j - 3].z + laserCloudScans[i]->points[j - 2].z
                       + laserCloudScans[i]->points[j - 1].z - 10 * laserCloudScans[i]->points[j].z
                       + laserCloudScans[i]->points[j + 1].z + laserCloudScans[i]->points[j + 2].z
                       + laserCloudScans[i]->points[j + 3].z + laserCloudScans[i]->points[j + 4].z
                       + laserCloudScans[i]->points[j + 5].z;
        Double2d distance(j, diffX * diffX + diffY * diffY + diffZ * diffZ);
        cloudCurvature.push_back(distance);
    }

Dividing each ring evenly into sectors:

// split the ring evenly into 6 sectors so that features are picked in all directions
for (int j = 0; j < 6; j++) {
    int sector_length = (int) (total_points / 6);
    int sector_start = sector_length * j;
    int sector_end = sector_length * (j + 1) - 1;
    if (j == 5) {
        sector_end = total_points - 1;
    }
    std::vector<Double2d>
        subCloudCurvature(cloudCurvature.begin() + sector_start, cloudCurvature.begin() + sector_end);
    // the actual feature selection happens here
    featureExtractionFromSector(laserCloudScans[i], subCloudCurvature, pc_out_edge, pc_out_surf);
}

Feature selection within each sector (featureExtractionFromSector):

// sort the points of the sector by curvature
std::sort(cloudCurvature.begin(), cloudCurvature.end(), [](const Double2d &a, const Double2d &b) {
    return a.value < b.value;
});

int largestPickedNum = 0;
std::vector<int> picked_points;
int point_info_count = 0;
// walk from the largest curvature down and decide whether each point is an edge point
for (int i = cloudCurvature.size() - 1; i >= 0; i--) {
    int ind = cloudCurvature[i].id;
    if (std::find(picked_points.begin(), picked_points.end(), ind) == picked_points.end()) {
        if (cloudCurvature[i].value <= 0.1) { // curvature too small; everything remaining is even smaller, so stop
            break;
        }

        largestPickedNum++;
        picked_points.push_back(ind);

        if (largestPickedNum <= 20) { // keep at most 20 edge points per sector
            pc_out_edge->push_back(pc_in->points[ind]); // treated as an edge point
            point_info_count++;
        } else {
            break;
        }

        // mark the 5 neighbors on each side as picked (unless a large depth gap is hit),
        // so that edge points are not selected in dense clusters
        for (int k = 1; k <= 5; k++) {
            double diffX = pc_in->points[ind + k].x - pc_in->points[ind + k - 1].x;
            double diffY = pc_in->points[ind + k].y - pc_in->points[ind + k - 1].y;
            double diffZ = pc_in->points[ind + k].z - pc_in->points[ind + k - 1].z;
            if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05) {
                break;
            }
            picked_points.push_back(ind + k);
        }
        for (int k = -1; k >= -5; k--) {
            double diffX = pc_in->points[ind + k].x - pc_in->points[ind + k + 1].x;
            double diffY = pc_in->points[ind + k].y - pc_in->points[ind + k + 1].y;
            double diffZ = pc_in->points[ind + k].z - pc_in->points[ind + k + 1].z;
            if (diffX * diffX + diffY * diffY + diffZ * diffZ > 0.05) {
                break;
            }
            picked_points.push_back(ind + k);
        }
    }
}

// everything not picked as (or next to) an edge point is treated as a planar (surface) point
for (int i = 0; i <= (int) cloudCurvature.size() - 1; i++) {
    int ind = cloudCurvature[i].id;
    if (std::find(picked_points.begin(), picked_points.end(), ind) == picked_points.end()) {
        pc_out_surf->push_back(pc_in->points[ind]);
    }
}

Pose Estimation

The pose estimation node receives the edge and planar features, builds point-to-line and point-to-plane residuals from them, and solves for the rotation quaternion q and the translation t with Ceres. Honestly, I did not see much difference from LOAM/A-LOAM here.
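
For reference, the two residuals added further below can be written compactly. With the pose to be estimated $(R, t)$ mapping a feature point $p$ from the current lidar frame into the map frame, $T(p) = R\,p + t$, they are (my own summary of the code, not taken from a paper):

$$r_{edge} = \frac{\big| (T(p) - A) \times (T(p) - B) \big|}{|A - B|}, \qquad r_{surf} = n^{T} T(p) + d$$

where $A$ and $B$ are two points on the line fitted to the nearest edge-map neighbors, and $(n, d)$ is the plane fitted to the nearest surface-map neighbors.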

First the timestamps are aligned, and point clouds with stale timestamps are discarded.

// timestamp synchronization: clouds whose stamps agree within 0.5 * scan_period are treated as one scan; anything older is popped
mutex_lock.lock();
if(!pointCloudBuf.empty() && (pointCloudBuf.front()->header.stamp.toSec()<pointCloudSurfBuf.front()->header.stamp.toSec()-0.5*lidar_param.scan_period || pointCloudBuf.front()->header.stamp.toSec()<pointCloudEdgeBuf.front()->header.stamp.toSec()-0.5*lidar_param.scan_period)){
    ROS_WARN("time stamp unaligned error and odom discarded, pls check your data --> odom correction");
    pointCloudBuf.pop();
    mutex_lock.unlock();
    continue;
}

if(!pointCloudSurfBuf.empty() && (pointCloudSurfBuf.front()->header.stamp.toSec()<pointCloudBuf.front()->header.stamp.toSec()-0.5*lidar_param.scan_period || pointCloudSurfBuf.front()->header.stamp.toSec()<pointCloudEdgeBuf.front()->header.stamp.toSec()-0.5*lidar_param.scan_period)){
    pointCloudSurfBuf.pop();
    ROS_INFO("time stamp unaligned with extra point cloud, pls check your data --> odom correction");
    mutex_lock.unlock();
    continue;
}

if(!pointCloudEdgeBuf.empty() && (pointCloudEdgeBuf.front()->header.stamp.toSec()<pointCloudBuf.front()->header.stamp.toSec()-0.5*lidar_param.scan_period || pointCloudEdgeBuf.front()->header.stamp.toSec()<pointCloudSurfBuf.front()->header.stamp.toSec()-0.5*lidar_param.scan_period)){
    pointCloudEdgeBuf.pop();
    ROS_INFO("time stamp unaligned with extra point cloud, pls check your data --> odom correction");
    mutex_lock.unlock();
    continue;
}

A local map of corner and surface features is then maintained as a sliding window. The first frame is inserted directly into the local map as initialization; every subsequent frame is used to build constraints against it. The main flow:

// the first frame initializes the map directly; later frames update it
if(is_odom_inited == false){
    // a sliding window of 12 frames is maintained here; the current edge/surface feature clouds are added to the corresponding local feature maps
    odomEstimation.initMapWithPoints(pointcloud_edge_in, pointcloud_surf_in);
    is_odom_inited = true;
    ROS_INFO("odom inited");
}else{
    std::chrono::time_point<std::chrono::system_clock> start, end;
    start = std::chrono::system_clock::now();
    // build and solve the optimization problem, then update the odometry
    odomEstimation.updatePointsToMap(pointcloud_edge_in, pointcloud_surf_in);
    end = std::chrono::system_clock::now();
    std::chrono::duration<float> elapsed_seconds = end - start;
    total_frame++;
    float time_temp = elapsed_seconds.count() * 1000;
    total_time+=time_temp;
    ROS_INFO("average odom estimation time %f ms \n \n", total_time/total_frame);
}

Building the point-to-line constraint: find the 5 nearest neighbors and run a principal component analysis on their covariance matrix. If the 5 points lie on a line, the covariance matrix has one eigenvalue significantly larger than the other two, and the corresponding eigenvector gives the direction of the line. If they lie on a plane, there is one significantly small eigenvalue, and the corresponding eigenvector gives the plane normal. Reference: Fast and Robust 3D Feature Extraction from Sparse Point Clouds, IROS 2016.

int corner_num = 0;
for (int i = 0; i < (int) pc_in->points.size(); i++) {
    pcl::PointXYZI point_temp;
    pointAssociateToMap(&(pc_in->points[i]), &point_temp); // transform the current point into the map frame

    // find its 5 nearest neighbors in the edge local map
    std::vector<int> pointSearchInd;
    std::vector<float> pointSearchSqDis;
    kdtreeEdgeMap->nearestKSearch(point_temp, 5, pointSearchInd, pointSearchSqDis);
    if (pointSearchSqDis[4] < 1.0) {
        std::vector<Eigen::Vector3d> nearCorners;
        // mean of the 5 neighbors
        Eigen::Vector3d center(0, 0, 0);
        for (int j = 0; j < 5; j++) {
            Eigen::Vector3d tmp(map_in->points[pointSearchInd[j]].x,
                                map_in->points[pointSearchInd[j]].y,
                                map_in->points[pointSearchInd[j]].z);
            center = center + tmp;
            nearCorners.push_back(tmp);
        }
        center = center / 5.0;
        // covariance matrix of the neighbors
        Eigen::Matrix3d covMat = Eigen::Matrix3d::Zero();
        for (int j = 0; j < 5; j++) {
            Eigen::Matrix<double, 3, 1> tmpZeroMean = nearCorners[j] - center;
            covMat = covMat + tmpZeroMean * tmpZeroMean.transpose();
        }

        // principal component analysis to recover the line parameters
        Eigen::SelfAdjointEigenSolver<Eigen::Matrix3d> saes(covMat);

        Eigen::Vector3d unit_direction = saes.eigenvectors().col(2);
        Eigen::Vector3d curr_point(pc_in->points[i].x, pc_in->points[i].y, pc_in->points[i].z);
        if (saes.eigenvalues()[2] > 3 * saes.eigenvalues()[1]) { // largest eigenvalue clearly dominates the second: the neighbors form a line
            Eigen::Vector3d point_on_line = center; // use the mean as a point on the line
            Eigen::Vector3d point_a, point_b;
            // pick two points on the line: mean +/- 0.1 * unit direction vector
            point_a = 0.1 * unit_direction + point_on_line;
            point_b = -0.1 * unit_direction + point_on_line;

            // build the point-to-line residual from O, A, B: A and B are in the map frame, the current point O is in the
            // lidar frame and is transformed by the pose inside the cost function; residual = distance from O to line AB
            ceres::CostFunction *cost_function = new EdgeAnalyticCostFunction(curr_point, point_a, point_b);
            // add the edge residual block
            problem.AddResidualBlock(cost_function, loss_function, parameters);
            corner_num++;
        }
    }
}
if (corner_num < 20) {
    printf("not enough correct points");
}
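
The internals of EdgeAnalyticCostFunction are not shown here (it lives in lidarOptimization.cpp and also supplies analytic Jacobians), but the residual it evaluates is simply the point-to-line distance. A minimal numeric sketch with Eigen; the function name and signature are mine for illustration, not FLOAM's:

#include <Eigen/Dense>

// distance from the transformed point to the line through A and B:
//   d = |(T(p) - A) x (T(p) - B)| / |A - B|
double pointToLineResidual(const Eigen::Vector3d &p,    // edge point in the current lidar frame
                           const Eigen::Vector3d &A,    // first point on the map line
                           const Eigen::Vector3d &B,    // second point on the map line
                           const Eigen::Quaterniond &q, // current rotation estimate (lidar -> map)
                           const Eigen::Vector3d &t) {  // current translation estimate (lidar -> map)
    Eigen::Vector3d p_map = q * p + t;                   // transform the point into the map frame
    Eigen::Vector3d area = (p_map - A).cross(p_map - B); // parallelogram area vector
    return area.norm() / (A - B).norm();                 // area / base length = point-to-line distance
}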

The point-to-plane constraint: fetch the 5 nearest neighbors from the local map with a kd-tree, obtain the plane parameters by solving a small least-squares problem via QR decomposition, and then build the point-to-plane residual.

int surf_num = 0;
for (int i = 0; i < (int) pc_in->points.size(); i++) {
    pcl::PointXYZI point_temp;
    pointAssociateToMap(&(pc_in->points[i]), &point_temp);
    std::vector<int> pointSearchInd;
    std::vector<float> pointSearchSqDis;
    kdtreeSurfMap->nearestKSearch(point_temp, 5, pointSearchInd, pointSearchSqDis);

    // analogous to the corner association above: associate the surface feature point
    // with the plane spanned by its 5 nearest neighbors in the surface local map
    Eigen::Matrix<double, 5, 3> matA0;
    Eigen::Matrix<double, 5, 1> matB0 = -1 * Eigen::Matrix<double, 5, 1>::Ones();
    if (pointSearchSqDis[4] < 1.0) {

        // stack the 5 neighbors into a 5x3 matrix
        for (int j = 0; j < 5; j++) {
            matA0(j, 0) = map_in->points[pointSearchInd[j]].x;
            matA0(j, 1) = map_in->points[pointSearchInd[j]].y;
            matA0(j, 2) = map_in->points[pointSearchInd[j]].z;
        }
        // solve for the plane normal:
        // plane Ax + By + Cz + D = 0  =>  (x, y, z) * (A/D, B/D, C/D)^T = -1  =>  solve A x = b; x is the (unnormalized) normal
        Eigen::Vector3d norm = matA0.colPivHouseholderQr().solve(matB0);
        double negative_OA_dot_norm = 1 / norm.norm();
        norm.normalize();

        // check whether the plane fit is valid:
        // with the unit normal n the plane is n^T X + 1/|n| = 0; if any neighbor is farther than 0.2 from it, reject the plane
        bool planeValid = true;
        for (int j = 0; j < 5; j++) {
            // if |n . X + d| > 0.2, the plane does not fit well
            if (fabs(norm(0) * map_in->points[pointSearchInd[j]].x +
                     norm(1) * map_in->points[pointSearchInd[j]].y +
                     norm(2) * map_in->points[pointSearchInd[j]].z + negative_OA_dot_norm) > 0.2) {
                planeValid = false;
                break;
            }
        }
        Eigen::Vector3d curr_point(pc_in->points[i].x, pc_in->points[i].y, pc_in->points[i].z);
        if (planeValid) {
            // if valid, add the point-to-plane residual; as above, the plane is in the map frame and the current
            // point is transformed by the pose inside the cost function; residual = distance from the point to the plane
            ceres::CostFunction *cost_function = new SurfNormAnalyticCostFunction(curr_point, norm, negative_OA_dot_norm);
            problem.AddResidualBlock(cost_function, loss_function, parameters);

            surf_num++;
        }
    }

}
if (surf_num < 20) {
    printf("not enough correct points");
}
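
Similarly, SurfNormAnalyticCostFunction (also in lidarOptimization.cpp) evaluates the signed point-to-plane distance using the unit normal and the offset negative_OA_dot_norm computed above. A minimal sketch, again with a function name of my own:

#include <Eigen/Dense>

// signed distance from the transformed point to the plane n^T x + d = 0
double pointToPlaneResidual(const Eigen::Vector3d &p,    // surface point in the current lidar frame
                            const Eigen::Vector3d &n,    // unit plane normal from the QR fit above
                            double d,                    // plane offset (negative_OA_dot_norm)
                            const Eigen::Quaterniond &q, // current rotation estimate (lidar -> map)
                            const Eigen::Vector3d &t) {  // current translation estimate (lidar -> map)
    Eigen::Vector3d p_map = q * p + t;                   // transform the point into the map frame
    return n.dot(p_map) + d;                             // signed point-to-plane distance
}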

Finally, solve the optimization problem and update the odometry.

if (laserCloudCornerMap->points.size() > 10 && laserCloudSurfMap->points.size() > 50) {
    kdtreeEdgeMap->setInputCloud(laserCloudCornerMap);
    kdtreeSurfMap->setInputCloud(laserCloudSurfMap);

    // Ceres optimization against the nearby local map
    for (int iterCount = 0; iterCount < optimization_count; iterCount++) {
        ceres::LossFunction *loss_function = new ceres::HuberLoss(0.1); // robust Huber loss
        ceres::Problem::Options problem_options;
        ceres::Problem problem(problem_options); // the optimization problem

        // the 7-dof pose parameter block with an SE(3) parameterization
        problem.AddParameterBlock(parameters, 7, new PoseSE3Parameterization());

        // add point-to-line and point-to-plane residuals for the two feature clouds
        addEdgeCostFactor(downsampledEdgeCloud, laserCloudCornerMap, problem, loss_function);
        addSurfCostFactor(downsampledSurfCloud, laserCloudSurfMap, problem, loss_function);

        ceres::Solver::Options options;
        options.linear_solver_type = ceres::DENSE_QR;
        options.max_num_iterations = 4;
        options.minimizer_progress_to_stdout = false;
        options.check_gradients = false;
        options.gradient_check_relative_precision = 1e-4;
        ceres::Solver::Summary summary;

        // solve; the optimized result is written back into q_w_curr / t_w_curr
        ceres::Solve(options, &problem, &summary);
    }
} else {
    printf("not enough points in map to associate, map error");
}

// odom is a global Eigen::Isometry3d, rebuilt from the optimized q_w_curr / t_w_curr
odom = Eigen::Isometry3d::Identity();
odom.linear() = q_w_curr.toRotationMatrix();
odom.translation() = t_w_curr;
addPointsToMap(downsampledEdgeCloud, downsampledSurfCloud); // add the downsampled features to the local map
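
The PoseSE3Parameterization used when adding the 7-dof parameter block is defined in lidarOptimization.cpp and, like A-LOAM, applies the optimization increment on a 6-dof tangent space while storing the state as quaternion plus translation. The sketch below only illustrates that idea on Ceres versions that still provide ceres::LocalParameterization; the parameter layout [qx, qy, qz, qw, tx, ty, tz], the left-multiplied increment, and the simplified translation update are my assumptions, not FLOAM's exact code:

#include <ceres/ceres.h>
#include <Eigen/Dense>
#include <Eigen/Geometry>

// Sketch of an SE(3) local parameterization: state stored as [q(4), t(3)],
// updated with a 6-dof increment delta = [d_theta(3), d_t(3)].
class PoseSE3ParameterizationSketch : public ceres::LocalParameterization {
public:
    bool Plus(const double *x, const double *delta, double *x_plus_delta) const override {
        Eigen::Map<const Eigen::Quaterniond> q(x);          // Eigen stores (x, y, z, w)
        Eigen::Map<const Eigen::Vector3d> t(x + 4);
        Eigen::Map<const Eigen::Vector3d> d_theta(delta);
        Eigen::Map<const Eigen::Vector3d> d_t(delta + 3);
        // small-angle rotation increment expressed as a quaternion
        Eigen::Quaterniond dq(1.0, 0.5 * d_theta.x(), 0.5 * d_theta.y(), 0.5 * d_theta.z());
        dq.normalize();
        Eigen::Map<Eigen::Quaterniond> q_new(x_plus_delta);
        Eigen::Map<Eigen::Vector3d> t_new(x_plus_delta + 4);
        q_new = (dq * q).normalized();                       // left-multiply the rotation increment
        t_new = t + d_t;                                     // simplified additive translation update
        return true;
    }
    bool ComputeJacobian(const double *x, double *jacobian) const override {
        // identity "lift": the analytic cost functions are assumed to differentiate
        // directly with respect to the 6-dof increment
        Eigen::Map<Eigen::Matrix<double, 7, 6, Eigen::RowMajor>> J(jacobian);
        J.setZero();
        J.topLeftCorner<6, 6>().setIdentity();
        return true;
    }
    int GlobalSize() const override { return 7; }
    int LocalSize() const override { return 6; }
};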

Summary

There is not much to say about the mapping and evaluation nodes that follow, so I skip them. After walking through the whole pipeline, it feels essentially the same as A-LOAM; I am not sure where the claimed improvements actually show up.
