What is it about?

Unmanned surface vehicles (USVs) are an increasingly active research topic and can be used in a variety of civil and military missions. However, compared with the relative maturity of other technologies, the sensing technology of unmanned surface vehicles is still relatively weak. Taking the "WAM-V-USV" as the research platform, this paper mainly focuses on methods for detecting and tracking moving objects with unmanned surface vehicles. It introduces the environment sensing system of the unmanned vehicle, water surface image preprocessing, SVM-based water–sky line (horizon) detection, and a water surface object detection and tracking method based on an improved YOLOv3. Simulation results show that the proposed method effectively improves the accuracy of moving object detection and tracking. Practical deployments on the Songhua River and at the US Unmanned Surface Vehicles Open demonstrate that the algorithm delivers good detection and tracking performance while meeting real-time requirements. Practice shows that this deep-learning-based object detection and tracking method greatly improves the perception ability and self-protection (security) of unmanned surface vehicles.
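
The paper does not include code here, but a minimal sketch of the kind of pipeline described above (YOLOv3-style detection followed by simple frame-to-frame association) could look like the following Python example using OpenCV's DNN module. The configuration and weight file names ("yolov3.cfg", "yolov3.weights"), the video path, the thresholds, and the greedy IoU tracker are illustrative assumptions, not the authors' implementation.

import cv2
import numpy as np

# Hypothetical Darknet config/weights; the paper's retrained model is not public.
net = cv2.dnn.readNetFromDarknet("yolov3.cfg", "yolov3.weights")
out_names = net.getUnconnectedOutLayersNames()

def detect(frame, conf_thr=0.5, nms_thr=0.4):
    # Run one YOLOv3 forward pass and return boxes as (x, y, w, h, score).
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    boxes, scores = [], []
    for out in net.forward(out_names):
        for row in out:
            score = float(row[4] * row[5:].max())
            if score < conf_thr:
                continue
            cx, cy, bw, bh = row[0] * w, row[1] * h, row[2] * w, row[3] * h
            boxes.append([int(cx - bw / 2), int(cy - bh / 2), int(bw), int(bh)])
            scores.append(score)
    keep = cv2.dnn.NMSBoxes(boxes, scores, conf_thr, nms_thr)
    return [tuple(boxes[i]) + (scores[i],) for i in np.array(keep).flatten()]

def iou(a, b):
    # Intersection over union of two (x, y, w, h) boxes.
    ix = max(0, min(a[0] + a[2], b[0] + b[2]) - max(a[0], b[0]))
    iy = max(0, min(a[1] + a[3], b[1] + b[3]) - max(a[1], b[1]))
    inter = ix * iy
    return inter / float(a[2] * a[3] + b[2] * b[3] - inter + 1e-9)

def update_tracks(tracks, detections, iou_thr=0.3):
    # Greedy IoU matching: reuse the best overlapping track id, else start a new one.
    new_tracks, used = {}, set()
    next_id = max(tracks, default=-1) + 1
    for det in detections:
        box = det[:4]
        best_id, best_iou = None, iou_thr
        for tid, tbox in tracks.items():
            o = iou(box, tbox)
            if tid not in used and o > best_iou:
                best_id, best_iou = tid, o
        if best_id is None:
            best_id, next_id = next_id, next_id + 1
        used.add(best_id)
        new_tracks[best_id] = box
    return new_tracks

cap = cv2.VideoCapture("usv_camera.mp4")  # illustrative on-board camera recording
tracks = {}
while True:
    ok, frame = cap.read()
    if not ok:
        break
    tracks = update_tracks(tracks, detect(frame))
    for tid, (x, y, w, h) in tracks.items():
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
        cv2.putText(frame, "id %d" % tid, (x, y - 5), cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)
    cv2.imshow("detection + tracking", frame)
    if cv2.waitKey(1) == 27:  # Esc quits
        break
cap.release()
cv2.destroyAllWindows()

In a real USV deployment, the detector would be retrained on water surface imagery and the association step would typically use a more robust tracker (for example Kalman filtering plus appearance features), but the overall structure of the loop is the same.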


Why is it important?

Unmanned surface vehicles (USVs) are an increasingly active research topic and can be used in a variety of civil and military missions. However, compared with the relative maturity of other technologies, the sensing technology of unmanned surface vehicles is still relatively weak.

Perspectives

Practice shows that this deep-learning-based object detection and tracking method greatly improves the perception ability and self-protection (security) of unmanned surface vehicles.

Dr Zhiyuan Chen
The University of Nottingham Malaysia

Read the Original

This page is a summary of: A object detection and tracking method for security in intelligence of unmanned surface vehicles, Journal of Ambient Intelligence and Humanized Computing, October 2020, Springer Science + Business Media,
DOI: 10.1007/s12652-020-02573-z.

