Reference environment: NVIDIA Jetson AGX Orin Developer Kit (32 GB), Ubuntu 20.04, JetPack 5.1.1, Intel RealSense D435i depth camera, ROS Noetic, VINS-Fusion-gpu
For details on configuring and running Jetson AGX Orin + VINS + ROS + D435i, see:
Jetson AGX ORIN配置运行vins-fusion-gpu(Zed/D435)_dueen1123的博客-CSDN博客
Octomap installation
1. Build and install the octomap_server mapping package
cd ~/catkin_ws/src
git clone https://github.com/OctoMap/octomap_mapping.git
Go back to the workspace root, install the dependencies, and build:
cd ~/catkin_ws
rosdep install octomap_mapping
sudo apt-get install ros-noetic-pcl-ros
sudo apt-get install ros-noetic-octomap-ros
catkin_make
If rosdep is not installed yet, run:
sudo apt update
sudo apt install python3-rosdep
sudo rosdep init
rosdep update
Then repeat the previous step.
2. Install the RViz plugin for visualizing Octomaps:
sudo apt-get install ros-noetic-octomap-rviz-plugins
Parameter configuration
1. Modify the extrinsic parameters
Edit realsense_stereo_imu_config.yaml under ~/catkin_ws/src/VINS-Fusion-gpu/config/realsense_d435i; ideally, calibrate the camera yourself first.
Reference: https://blog.csdn.net/qq_35616298/article/details/116171823
Example:
%YAML:1.0
#common parameters
#support: 1 imu 1 cam; 1 imu 2 cam; 2 cam;
imu: 1
num_of_cam: 2
imu_topic: "/camera/imu"
image0_topic: "/camera/infra1/image_rect_raw"
image1_topic: "/camera/infra2/image_rect_raw"
output_path: "/home/master/output/"
cam0_calib: "left.yaml"
cam1_calib: "right.yaml"
image_width: 640
image_height: 480
# Extrinsic parameters between the IMU and the camera.
estimate_extrinsic: 1   # 0  Have an accurate extrinsic parameter; trust the following imu^R_cam, imu^T_cam and do not change them.
                        # 1  Have an initial guess of the extrinsic parameter; it will be optimized around this initial guess.
body_T_cam0: !!opencv-matrix
rows: 4
cols: 4
dt: d
data: [ 0.9998528554981037, -0.000802125407182569, 0.00536519139410059, -0.0117530714817712,
0.000756690239208138, 0.999963883387885, 0.00846518397330719, 0.000876053017376294,
-0.00537178776070555, -0.00846099962481987, 0.999949776429598, 0.0330899634954485,
0, 0, 0, 1 ]
body_T_cam1: !!opencv-matrix
rows: 4
cols: 4
dt: d
data: [ 0.9999273228661367, -0.000653939965772498, 0.0037560266870867, 0.0361739507304974,
0.000620172019278801, 0.999959446460512, 0.00898447667104663, 0.000362155753378663,
-0.00376174967527688, -0.00898208199179027, 0.999952584597127, 0.0331025774267971,
0, 0, 0, 1 ]
#multiple-thread support
multiple_thread: 1
#feature tracker parameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 30            # min distance between two features
freq: 10                # frequency (Hz) of publishing the tracking result; at least 10 Hz for good estimation. If set to 0, the frequency equals the raw image rate
F_threshold: 1.0        # RANSAC threshold (pixels)
show_track: 0           # publish the tracking image as a topic
flow_back: 1            # perform forward and backward optical flow to improve feature tracking accuracy
#optimization parameters
max_solver_time: 0.04   # max solver time (s), to guarantee real-time performance
max_num_iterations: 8   # max solver iterations, to guarantee real-time performance
keyframe_parallax: 10.0 # keyframe selection threshold (pixels)
#imu parameters: the more accurate the parameters you provide, the better the performance
acc_n: 0.1              # accelerometer measurement noise standard deviation   #0.2 0.04
gyr_n: 0.01             # gyroscope measurement noise standard deviation       #0.05 0.004
acc_w: 0.001            # accelerometer bias random walk noise standard deviation   #0.002
gyr_w: 0.0001           # gyroscope bias random walk noise standard deviation       #4.0e-5
g_norm: 9.805           # gravity magnitude
#unsynchronization parameters
estimate_td: 1          # online estimation of the time offset between camera and IMU
td: 0.00                # initial value of the time offset, in seconds: read image clock + td = real image clock (IMU clock)
#loop closure parameters
load_previous_pose_graph: 0                              # load and reuse a previous pose graph from 'pose_graph_save_path'
pose_graph_save_path: "/home/master/output/pose_graph/"  # save and load path
save_image: 0           # save images in the pose graph for visualization; set to 0 to disable
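If you calibrate the extrinsics yourself, it is worth sanity-checking that the rotation block of body_T_cam is a proper rotation before pasting it into the config. A minimal numpy sketch (illustrative only, using the body_T_cam0 values above):
import numpy as np

# body_T_cam0 as listed in realsense_stereo_imu_config.yaml above
T = np.array([[ 0.9998528554981037, -0.000802125407182569, 0.00536519139410059, -0.0117530714817712],
              [ 0.000756690239208138, 0.999963883387885,   0.00846518397330719,  0.000876053017376294],
              [-0.00537178776070555, -0.00846099962481987, 0.999949776429598,    0.0330899634954485],
              [ 0.0, 0.0, 0.0, 1.0]])
R = T[:3, :3]
# a proper rotation satisfies R * R^T = I and det(R) = +1
print(np.allclose(R @ R.T, np.eye(3), atol=1e-3))    # expect True
print(np.isclose(np.linalg.det(R), 1.0, atol=1e-3))  # expect True
print("translation (m):", T[:3, 3])                  # camera position in the IMU/body frame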
2. Modify the intrinsic parameters
Start the camera:
roslaunch realsense2_camera rs_camera_orb3.launch   # change this to your own camera launch file
List the camera topics:
rostopic list
Find the camera_info topic for the stream you use; here it is /camera/infra1/camera_info. Echo it:
rostopic echo /camera/infra1/camera_info
You will see the intrinsics, e.g. K: [385.2993469238281, 0.0, 320.02239990234375, 0.0, 385.2993469238281, 239.9022216796875, 0.0, 0.0, 1.0], which is the row-major 3x3 matrix [fx, 0, cx; 0, fy, cy; 0, 0, 1].
Update [fx, fy, cx, cy] in left.yaml and right.yaml under ~/catkin_ws/src/VINS-Fusion-gpu/config/realsense_d435i with these values.
For example:
%YAML:1.0
---
model_type: PINHOLE
camera_name: camera
image_width: 640
image_height: 480
distortion_parameters:
   k1: 0.0
   k2: 0.0
   p1: 0.0
   p2: 0.0
projection_parameters:
   fx: 385.2993469238281
   fy: 385.2993469238281
   cx: 320.02239990234375
   cy: 239.9022216796875
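If you prefer to read these values programmatically instead of copying them from the rostopic echo output, a small rospy helper can print them (a sketch only; adjust the topic name to your camera):
#!/usr/bin/env python3
# Print fx, fy, cx, cy from a CameraInfo topic (sketch; topic name as used above).
import rospy
from sensor_msgs.msg import CameraInfo

rospy.init_node("print_intrinsics")
msg = rospy.wait_for_message("/camera/infra1/camera_info", CameraInfo)
K = msg.K  # row-major 3x3: [fx, 0, cx, 0, fy, cy, 0, 0, 1]
print("fx:", K[0], "fy:", K[4], "cx:", K[2], "cy:", K[5])
print("distortion D:", list(msg.D))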
Building an Octomap with VINS-Fusion-gpu
1. Set up the Octomap Server
Launch the VINS RViz view:
roslaunch vins vins_rviz.launch
Check the topics:
rostopic list
You will see /vins_estimator/point_cloud, which is the point cloud topic we need. However, octomap_server consumes sensor_msgs/PointCloud2 while VINS publishes sensor_msgs/PointCloud, so the cloud must first be converted to PointCloud2:
cd ~/catkin_ws/src
git clone https://github.com/1332927388/pcl2octomap.git
cd ~/catkin_ws && catkin_make
Open point_cloud_converter.launch under ~/catkin_ws/src/pcl2octomap/launch:
<node pkg="point_cloud_converter" name="point_cloud_converter" type="point_cloud_converter_node" >
<remap from="points_in" to="/vins_estimator/point_cloud"/>
<remap from="points2_out" to="/points" />
</node>
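For reference, the conversion this node performs is straightforward: it subscribes to the sensor_msgs/PointCloud from VINS and republishes it as sensor_msgs/PointCloud2. A minimal rospy sketch of the same idea (illustrative only, not part of the pcl2octomap package):
#!/usr/bin/env python3
# Sketch of the PointCloud -> PointCloud2 conversion done by point_cloud_converter.
import rospy
from sensor_msgs.msg import PointCloud, PointCloud2
import sensor_msgs.point_cloud2 as pc2

def callback(cloud):
    xyz = [(p.x, p.y, p.z) for p in cloud.points]           # geometry_msgs/Point32 -> tuples
    pub.publish(pc2.create_cloud_xyz32(cloud.header, xyz))  # keep the original header/frame

rospy.init_node("pointcloud_to_pointcloud2")
pub = rospy.Publisher("/points", PointCloud2, queue_size=1)
rospy.Subscriber("/vins_estimator/point_cloud", PointCloud, callback)
rospy.spin()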
Combine the snippet above with octomap_mapping.launch under ~/catkin_ws/src/octomap_mapping/octomap_server/launch, as follows:
<!--
Example launch file for octomap_server mapping:
Listens to incoming PointCloud2 data and incrementally builds an octomap.
The data is sent out in different representations.
Copy this file into your workspace and adjust as needed, see
www.ros.org/wiki/octomap_server for details
-->
<launch>
<node pkg="point_cloud_converter" name="point_cloud_converter" type="point_cloud_converter_node" >
<remap from="points_in" to="/vins_estimator/point_cloud"/>
<remap from="points2_out" to="/points" />
</node>
<node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
<param name="resolution" value="0.05" />
<!-- fixed map frame (set to 'map' if SLAM or localization running!) -->
<param name="frame_id" type="string" value="world" />
<!-- maximum range to integrate (speedup!) -->
<param name="sensor_model/max_range" value="5.0" />
<!-- data source to integrate (PointCloud2) -->
<remap from="cloud_in" to="points" />
</node>
</launch>
Note that frame_id must match the Fixed Frame of your RViz config. You can check it with:
vim ~/catkin_ws/src/VINS-Fusion-gpu/config/vins_rviz_config.rviz
In vim, type /Fixed Frame to find it; in my case it is world.
Also make sure the data source to integrate remap reads <remap from="cloud_in" to="points" />.
2. Build the occupancy grid map in RViz
Launch the VINS RViz view:
roslaunch vins vins_rviz.launch
Start the VINS estimator with the camera config file:
rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion-gpu/config/realsense_d435i/realsense_stereo_imu_config.yaml
Enable loop closure detection:
rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion-gpu/config/realsense_d435i/realsense_stereo_imu_config.yaml
Start the camera:
roslaunch realsense2_camera rs_camera_orb3.launch
Launch octomap_server:
roslaunch octomap_server octomap_mapping.launch
In RViz, add an OccupancyGrid display and set its Octomap Topic to /octomap_full to get the occupancy grid map.
3. Save the map
octomap_server can save maps.
To save a compact binary (.bt) map:
rosrun octomap_server octomap_saver mymap.bt
To save a full probabilistic octree map:
rosrun octomap_server octomap_saver -f mymap.ot
The files are saved in the directory where you run the command (your catkin_ws workspace here); you can also give an absolute path, e.g. ~/map_output/mymap.bt.
You can install the octovis visualization tool to inspect the octree model:
sudo apt-get install octovis
octovis mymap.bt
In octovis, the view mode under the View menu can be switched to a colored height rendering.
Building an Octomap with DenseSurfelMapping
Source: GitHub - HKUST-Aerial-Robotics/DenseSurfelMapping, branch VINS-supported
1. Clone the source into the workspace and build
cd ~/catkin_ws/src
git clone -b VINS-supported https://github.com/HKUST-Aerial-Robotics/DenseSurfelMapping.git
Because OpenCV 4 no longer provides the OpenCV 3 constant CV_BGR2GRAY, replace it throughout the package:
cd DenseSurfelMapping
grep -rl "CV_BGR2GRAY" . | xargs sed -i 's/CV_BGR2GRAY/cv::COLOR_BGR2GRAY/g'
# switches to the OpenCV 4 spelling; not needed if you are using OpenCV 3
Fix the build error caused by the PCL package:
gedit ~/catkin_ws/src/DenseSurfelMapping/surfel_fusion/CMakeLists.txt
# open the surfel_fusion CMakeLists.txt
In the file, add set(CMAKE_CXX_STANDARD 14) (for example near the top, after the project() line).
Build:
cd ~/catkin_ws
catkin_make
Edit ~/catkin_ws/src/DenseSurfelMapping/surfel_fusion/launch/vins_realsense.launch
Below the <!-- for data save --> comment, change the remaps to:
<remap from="~image" to="/camera/color/image_raw" />
<remap from="~depth" to="/camera/depth/image_rect_raw" />
For reference:
<launch>
<node pkg="surfel_fusion" type="surfel_fusion" name="surfel_fusion" clear_params="true" output="screen">
<!-- camera parameter -->
<param name="cam_width" value="640" />
<param name="cam_height" value="480" />
<!--input grey_image info-->
<param name="cam_fx" value="385.2993469238281" />
<param name="cam_fy" value="385.2993469238281" />
<param name="cam_cx" value="320.02239990234375" />
<param name="cam_cy" value="239.9022216796875" />
<!-- fusion parameter, all in meter -->
<param name="fuse_far_distence" value="3.0" />
<param name="fuse_near_distence" value="0.5" />
<!-- for deform the map -->
<param name="drift_free_poses" value="300" />
<!-- for data save -->
<!-- changed here: feed /camera/color/image_raw and /camera/depth/image_rect_raw as inputs -->
<!-- <remap from="~image" to="/camera/infra1/image_rect_raw" />-->
<remap from="~image" to="/camera/color/image_raw" />
<remap from="~depth" to="/camera/depth/image_rect_raw" />
<!-- <remap from="~depth" to="/camera/aligned_depth_to_color/image_raw" />-->
<remap from="~loop_path" to="/loop_fusion/pose_graph_path" />
<remap from="~extrinsic_pose" to="/vins_estimator/extrinsic" />
<param name="save_name" value="$(find surfel_fusion)/../../../output_map"/>
</node>
<node type="rviz" name="rviz" pkg="rviz" args="-d $(find surfel_fusion)/launch/surfel.rviz" />
</launch>
Also edit the camera launch file to enable the color and depth topics. It is under ~/catkin_ws/src/realsense-ros-ros1-legacy/realsense2_camera/launch; mine is rs_camera_orb3.launch.
For reference:
<launch>
<arg name="serial_no" default=""/>
<arg name="usb_port_id" default=""/>
<arg name="device_type" default=""/>
<arg name="json_file_path" default=""/>
<arg name="camera" default="camera"/>
<arg name="tf_prefix" default="$(arg camera)"/>
<arg name="external_manager" default="false"/>
<arg name="manager" default="realsense2_camera_manager"/>
<arg name="fisheye_width" default="640"/>
<arg name="fisheye_height" default="480"/>
<arg name="enable_fisheye" default="false"/>
<arg name="depth_width" default="640"/>
<arg name="depth_height" default="480"/>
<arg name="enable_depth" default="true"/>#此处打开
<arg name="infra_width" default="640"/>
<arg name="infra_height" default="480"/>
<arg name="enable_infra1" default="true"/>
<arg name="enable_infra2" default="true"/>
<arg name="color_width" default="640"/>
<arg name="color_height" default="480"/>
<arg name="enable_color" default="true"/>#此处打开
<arg name="fisheye_fps" default="30"/>
<arg name="depth_fps" default="30"/>
<arg name="infra_fps" default="30"/>
<arg name="color_fps" default="30"/>
<arg name="gyro_fps" default="200"/>
<arg name="accel_fps" default="250"/>
<arg name="enable_gyro" default="true"/>
<arg name="enable_accel" default="true"/>
<arg name="enable_pointcloud" default="false"/>
<arg name="pointcloud_texture_stream" default="RS2_STREAM_COLOR"/>
<arg name="pointcloud_texture_index" default="0"/>
<arg name="enable_sync" default="true"/>
<arg name="align_depth" default="false"/>#关闭了变得稳定很多
<arg name="publish_tf" default="true"/>
<arg name="tf_publish_rate" default="0"/>
<arg name="filters" default=""/>
<arg name="clip_distance" default="6"/>
<arg name="linear_accel_cov" default="0.01"/>
<arg name="initial_reset" default="true"/>
<arg name="unite_imu_method" default="linear_interpolation"/>
<arg name="topic_odom_in" default="odom_in"/>
<arg name="calib_odom_file" default=""/>
<arg name="publish_odom_tf" default="true"/>
<arg name="allow_no_texture_points" default="false"/>
<arg name="emitter_enable" default="false"/>
<!-- equivalent to: rosparam set /camera/stereo_module/emitter_enabled false -->
<rosparam>
/camera/stereo_module/emitter_enabled: 0
</rosparam>
<rosparam if="$(arg emitter_enable)">
/camera/stereo_module/emitter_enabled: 1
</rosparam>
<group ns="$(arg camera)">
<include file="$(find realsense2_camera)/launch/includes/nodelet.launch.xml">
<arg name="tf_prefix" value="$(arg tf_prefix)"/>
<arg name="external_manager" value="$(arg external_manager)"/>
<arg name="manager" value="$(arg manager)"/>
<arg name="serial_no" value="$(arg serial_no)"/>
<arg name="usb_port_id" value="$(arg usb_port_id)"/>
<arg name="device_type" value="$(arg device_type)"/>
<arg name="json_file_path" value="$(arg json_file_path)"/>
<arg name="enable_pointcloud" value="$(arg enable_pointcloud)"/>
<arg name="pointcloud_texture_stream" value="$(arg pointcloud_texture_stream)"/>
<arg name="pointcloud_texture_index" value="$(arg pointcloud_texture_index)"/>
<arg name="enable_sync" value="$(arg enable_sync)"/>
<arg name="align_depth" value="$(arg align_depth)"/>
<arg name="fisheye_width" value="$(arg fisheye_width)"/>
<arg name="fisheye_height" value="$(arg fisheye_height)"/>
<arg name="enable_fisheye" value="$(arg enable_fisheye)"/>
<arg name="depth_width" value="$(arg depth_width)"/>
<arg name="depth_height" value="$(arg depth_height)"/>
<arg name="enable_depth" value="$(arg enable_depth)"/>
<arg name="color_width" value="$(arg color_width)"/>
<arg name="color_height" value="$(arg color_height)"/>
<arg name="enable_color" value="$(arg enable_color)"/>
<arg name="infra_width" value="$(arg infra_width)"/>
<arg name="infra_height" value="$(arg infra_height)"/>
<arg name="enable_infra1" value="$(arg enable_infra1)"/>
<arg name="enable_infra2" value="$(arg enable_infra2)"/>
<arg name="fisheye_fps" value="$(arg fisheye_fps)"/>
<arg name="depth_fps" value="$(arg depth_fps)"/>
<arg name="infra_fps" value="$(arg infra_fps)"/>
<arg name="color_fps" value="$(arg color_fps)"/>
<arg name="gyro_fps" value="$(arg gyro_fps)"/>
<arg name="accel_fps" value="$(arg accel_fps)"/>
<arg name="enable_gyro" value="$(arg enable_gyro)"/>
<arg name="enable_accel" value="$(arg enable_accel)"/>
<arg name="publish_tf" value="$(arg publish_tf)"/>
<arg name="tf_publish_rate" value="$(arg tf_publish_rate)"/>
<arg name="filters" value="$(arg filters)"/>
<arg name="clip_distance" value="$(arg clip_distance)"/>
<arg name="linear_accel_cov" value="$(arg linear_accel_cov)"/>
<arg name="initial_reset" value="$(arg initial_reset)"/>
<arg name="unite_imu_method" value="$(arg unite_imu_method)"/>
<arg name="topic_odom_in" value="$(arg topic_odom_in)"/>
<arg name="calib_odom_file" value="$(arg calib_odom_file)"/>
<arg name="publish_odom_tf" value="$(arg publish_odom_tf)"/>
<arg name="allow_no_texture_points" value="$(arg allow_no_texture_points)"/>
</include>
</group>
</launch>
2. Start Surfel Fusion to generate the point cloud
Start Surfel Fusion:
roslaunch surfel_fusion vins_realsense.launch
Connect the D435i and start the camera:
roslaunch realsense2_camera rs_camera_orb3.launch
Start the VINS estimator with the camera config file:
rosrun vins vins_node ~/catkin_ws/src/VINS-Fusion-gpu/config/realsense_d435i/realsense_stereo_imu_config.yaml
Enable loop closure detection:
rosrun loop_fusion loop_fusion_node ~/catkin_ws/src/VINS-Fusion-gpu/config/realsense_d435i/realsense_stereo_imu_config.yaml
3. Start Octomap to generate the occupancy grid map
Edit octomap_mapping.launch under ~/catkin_ws/src/octomap_mapping/octomap_server/launch and change the line below <!-- data source to integrate (PointCloud2) --> to <remap from="cloud_in" to="surfel_fusion/rgb_pointcloud" />, as follows:
<!--
Example launch file for octomap_server mapping:
Listens to incoming PointCloud2 data and incrementally builds an octomap.
The data is sent out in different representations.
Copy this file into your workspace and adjust as needed, see
www.ros.org/wiki/octomap_server for details
-->
<launch>
<node pkg="point_cloud_converter" name="point_cloud_converter" type="point_cloud_converter_node" >
<remap from="points_in" to="/vins_estimator/point_cloud"/>
<remap from="points2_out" to="/points" />
<!--
<remap from="points2_in" to="velodyne_points"/>
<remap from="points_out" to="velodyne_points" />
-->
</node>
<node pkg="octomap_server" type="octomap_server_node" name="octomap_server">
<param name="resolution" value="0.05" />
<!-- fixed map frame (set to 'map' if SLAM or localization running!) -->
<param name="frame_id" type="string" value="world" />
<!-- maximum range to integrate (speedup!) -->
<param name="sensor_model/max_range" value="5.0" />
<!-- data source to integrate (PointCloud2) -->
<remap from="cloud_in" to="surfel_fusion/rgb_pointcloud" />
</node>
</launch>
Once the RGB point cloud is being published, launch octomap_server:
roslaunch octomap_server octomap_mapping.launch
After it starts, add an OccupancyGrid display in the Surfel Fusion RViz window and set its topic to /octomap_full or /octomap_binary to get the occupancy grid map.
octomap_server can save maps.
To save a compact binary (.bt) map:
rosrun octomap_server octomap_saver mymap.bt
To save a full probabilistic octree map:
rosrun octomap_server octomap_saver -f mymap.ot
The files are saved in the directory where you run the command (your catkin_ws workspace here); you can also give an absolute path, e.g. ~/map_output/mymap.bt.
You can install the octovis visualization tool to inspect the octree model:
sudo apt-get install octovis
octovis mymap.bt
This concludes the walkthrough of Octomap mapping with the Jetson AGX Orin and a D435i camera.