bounding box. Experiments show that Faster R-CNN with ResNet101 as the feature extractor outperforms the other algorithms.
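For context, the sketch below shows one way such a detector could be instantiated with torchvision. The ResNet101-FPN backbone, the two-class head (background plus a single object class) and the input size are assumptions for illustration, not necessarily the exact configuration used in the experiments.

```python
import torch
from torchvision.models.detection import FasterRCNN
from torchvision.models.detection.backbone_utils import resnet_fpn_backbone

# Hypothetical setup (recent torchvision assumed): Faster R-CNN with a
# ResNet101-FPN feature extractor and a small two-class head.
backbone = resnet_fpn_backbone(backbone_name="resnet101", weights=None)
model = FasterRCNN(backbone, num_classes=2)  # background + one object class
model.eval()

# Inference on a dummy RGB image; each output dict contains "boxes",
# "labels" and "scores" for the detections.
with torch.no_grad():
    predictions = model([torch.rand(3, 512, 512)])
print(predictions[0]["boxes"].shape)
```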
# An Efficient Multi-sensor Fusion Approach for Object Detection in Maritime Environment [6]:
Robust real-time object detection and tracking are challenging problems in autonomous transportation systems because the algorithms must operate in inherently uncertain, dynamic environments with rapidly moving objects. Tracking and detection algorithms must therefore cooperate with each other to achieve smooth tracking of detected objects that can later be used by the navigation system. In this paper, we first present an efficient multi-sensor fusion approach based on the probabilistic data association method in order to achieve accurate object detection and tracking results. The proposed approach fuses the detection results obtained independently from four main sensors: radar, LiDAR, an RGB camera and an infrared camera. It generates object region proposals based on the fused detection result. Then, a Convolutional Neural Network (CNN) is used to identify the object categories within these regions. The CNN is trained on a real dataset covering different ferry driving scenarios. Experimental tracking and classification results on real datasets show that the proposed approach provides reliable object detection and classification in maritime environments.
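To make the fusion step more concrete, the following is a minimal, hypothetical sketch of probabilistic-data-association-style weighting of detections reported by several sensors for the same object. It is not the authors' implementation; the function name `pda_fuse`, the gating covariance, the detection probability, the clutter term and the example coordinates are all assumptions for illustration.

```python
import numpy as np

def pda_fuse(predicted_pos, detections, cov, p_detect=0.9, clutter=1e-4):
    """Weight each candidate detection by its Gaussian likelihood around the
    predicted object position and return the probability-weighted fused
    position together with the association weights (simplified PDA update)."""
    inv_cov = np.linalg.inv(cov)
    norm = 1.0 / np.sqrt((2.0 * np.pi) ** 2 * np.linalg.det(cov))
    likelihoods = np.array(
        [norm * np.exp(-0.5 * (z - predicted_pos) @ inv_cov @ (z - predicted_pos))
         for z in detections]
    )
    beta = p_detect * likelihoods              # per-detection association terms
    beta_none = clutter * (1.0 - p_detect)     # "no detection is correct" term
    weights = beta / (beta.sum() + beta_none)
    fused = predicted_pos + (weights[:, None] * (detections - predicted_pos)).sum(axis=0)
    return fused, weights

# Example: detections of one object reported by radar, LiDAR, RGB and IR sensors
# (coordinates in a common metric plane; values are made up for illustration).
detections = np.array([[50.2, 10.1], [49.8, 9.9], [50.5, 10.4], [50.0, 10.0]])
fused_pos, weights = pda_fuse(np.array([50.0, 10.0]), detections, cov=np.diag([1.0, 1.0]))
print("fused position:", fused_pos)
print("association weights:", weights)
```

In the paper's pipeline, the fused result would define a region proposal that is then classified by the CNN trained on the ferry driving dataset.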
# References
1. F. Farahnakian, M. Haghbayan, J. Poikonen, M. Laurinen, P. Nevalainen and J. Heikkonen, “Object Detection based on Multi-sensor Proposal Fusion in Maritime Environment”, The 17th IEEE International Conference on Machine Learning and Applications (ICMLA), 2018, USA.
2. F. Farahnakian, P. Movahedi, J. Poikonen, E. Lehtonen, D. Makris and J. Heikkonen, “Comparative Analysis of Image Fusion Methods in Marine Environment”, The 13th IEEE International Symposium on Robotic and Sensors Environments (ROSE), 2019, Canada.
3. F. Farahnakian, J. Poikonen, M. Laurinen, D. Makris and J. Heikkonen, “Visible and Infrared Image Fusion Framework based on RetinaNet for Marine Environment”, The 22nd IEEE International Conference on Information Fusion (FUSION), 2019, Canada.
4. F. Farahnakian, J. Poikonen, M. Laurinen and J. Heikkonen, “Deep Convolutional Neural Network-based Fusion of RGB and IR Images in Marine Environment”, The 22nd IEEE International Conference on Intelligent Transportation Systems (ITSC), 2019, New Zealand.
5. V. Soloviev, F. Farahnakian, L. Zelioli, B. Iancu, J. Lilius and J. Heikkonen, “Comparing CNN-Based Object Detectors on Two Novel Maritime Datasets”, 2020 IEEE International Conference on Multimedia & Expo Workshops (ICMEW), London, United Kingdom, 2020, pp. 1-6, doi: 10.1109/ICMEW46912.2020.9106019.
6. M. Haghbayan, F. Farahnakian, J. Poikonen, M. Laurinen, P. Nevalainen and J. Heikkonen, “An Efficient Multi-sensor Fusion Approach for Object Detection in Maritime Environment”, The 21st IEEE International Conference on Intelligent Transportation Systems (ITSC), 2018, USA.