Fusion of LiDAR and Camera
We used:
- Velodyne VLP-16 LiDAR
- Logitech StreamCam
A release version of the fusion code is available. Place it in your catkin_ws/src/ directory.
You can change the `hres` and `vres` parameters in the launch file.
```xml
<launch>
  <node name="fusion_car_tracking" pkg="fusion_car_tracking" type="fusion_car_tracking" output="screen" respawn="true">
    <param name="hres" value="0.07480" />
    <param name="vres" value="0.101018" />
    <param name="y_fudge" value="120" />
    <param name="y_fudge2" value="-190" />
    <param name="map" value="0"/>
  </node>
</launch>
```
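The repository does not document what `hres`, `vres`, and the `y_fudge` offsets mean. A plausible reading, common in LiDAR-to-image spherical-projection code, is that `hres`/`vres` are degrees of field of view per image pixel and `y_fudge` is a vertical pixel offset used to line the projected points up with the camera image. A minimal Python sketch under that assumption (all numbers below are hypothetical, not this repo's calibration):

```python
import math

def angular_resolution(fov_deg: float, pixels: int) -> float:
    """Degrees of field of view covered per pixel."""
    return fov_deg / pixels

def project_point(x: float, y: float, z: float,
                  hres: float, vres: float, y_fudge: float = 0.0):
    """Project a 3-D LiDAR point (metres, sensor frame, x forward) onto
    image coordinates via a spherical projection: azimuth/elevation in
    degrees divided by the per-pixel angular resolution, plus a vertical
    fudge offset. This is a sketch of the assumed convention, not the
    actual fusion_car_tracking implementation."""
    azimuth = math.degrees(math.atan2(y, x))
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))
    u = -azimuth / hres                # horizontal pixel (y-left frame flips sign)
    v = -elevation / vres + y_fudge    # vertical pixel (image y grows downward)
    return u, v

# Hypothetical camera geometry -- substitute your own FOV and resolution.
hres = angular_resolution(60.0, 800)   # 0.075 deg/px
vres = angular_resolution(45.0, 450)   # 0.1 deg/px
u, v = project_point(20.0, -1.5, 0.3, hres, vres, y_fudge=120)
```

A point straight ahead of the sensor (`y = z = 0`) projects to `u = 0`, `v = y_fudge`, which is why the fudge terms act as pure image-space translations.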
In catkin_ws:

```shell
catkin_make -DCMAKE_BUILD_TYPE=Release
source devel/setup.bash
roslaunch fusion_car_tracking fusion.launch
```
Before running the fusion code, please check that the YOLOv3 and LiDAR tracking code are running.
An official Tracking_msg viewer will be provided.

Estimation of Closest In-Path Vehicle (CIPV) by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicle
```
@misc{bae2021estimation,
      title={Estimation of Closest In-Path Vehicle (CIPV) by Low-Channel LiDAR and Camera Sensor Fusion for Autonomous Vehicle},
      author={Hyunjin Bae and Gu Lee and Jaeseung Yang and Gwanjun Shin and Yongseob Lim and Gyeungho Choi},
      year={2021},
      eprint={2103.13952},
      archivePrefix={arXiv},
      primaryClass={cs.RO}
}
```