Perception technology for the conversion of off-road vehicles for unmanned missions




lidar, navigation, unmanned vehicles, perception system, obstacle detection


Autonomous ground vehicles (AGVs) have great potential for a wide range of applications in both civilian and military fields. There is currently strong interest in converting existing off-road vehicles into remotely controlled platforms with an autonomous mode, owing to several benefits: reduced risk to human life, increased efficiency and accuracy, and the ability to operate in hazardous zones. Environmental perception technology plays a critical role in enabling safe and effective operation during unmanned missions. It relies on cameras, laser scanners, and other sensors to gather information about the environment and provide the unmanned ground vehicle (UGV) with a perception of its surroundings. In recent years there has been significant progress in the development of environmental perception systems, and advanced sensors, machine learning algorithms, and related innovations have become a focus of research and development in many countries. This paper describes a combination of commercially available vision sensors, laser scanners, and navigation modules that provides a comprehensive understanding of the operational environment, orientation, and object recognition during autonomous operation. Typical methods for vision- and lidar-based obstacle detection and object classification for unmanned vehicles are described. The aim of the work was to examine, in a real environment, the performance of a perception system configured with daylight-thermal observation and stereo cameras, a lidar sensor, a GNSS module, and radio links, together with computing units. The system was evaluated in terms of the performance of the individual sensors, with a view to its implementation on an all-terrain vehicle as a subsystem for unmanned operation.

