Deep Learning Waterline Detection for Low-cost Autonomous Boats

Domenico Bloisi
2019-01-01

Abstract

Waterline detection in images captured from a moving camera mounted on an autonomous boat is a complex task, due to the presence of reflections, illumination changes, camera jitter, and waves. The pose of the boat and the presence of obstacles in front of it can be inferred by extracting the waterline. In this work, we present a supervised method for waterline detection, which can be used for low-cost autonomous boats. The method is based on a Fully Convolutional Neural Network that produces a pixel-wise image segmentation. Experiments have been carried out on a publicly available data set of images and videos, collected in a challenging scenario with multiple floating obstacles (buoys, sailing boats, and motor boats). Quantitative results show the effectiveness of the proposed approach, with 0.97 accuracy at a speed of 9 frames per second.
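To illustrate the general idea of pixel-wise segmentation with a fully convolutional network, the sketch below shows a minimal encoder-decoder model in PyTorch. The architecture, layer sizes, input resolution, and two-class (water vs. non-water) setup are illustrative assumptions for exposition only, not the network described in the paper.

# Minimal sketch of pixel-wise water/non-water segmentation with a small
# fully convolutional network (PyTorch assumed). Hypothetical architecture,
# not the authors' actual model.
import torch
import torch.nn as nn

class TinyFCN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        # Encoder: downsample the input frame by a factor of 4.
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(inplace=True),
        )
        # Decoder: upsample back to the input resolution and predict
        # a per-pixel class score map (water vs. non-water).
        self.decoder = nn.Sequential(
            nn.ConvTranspose2d(32, 16, kernel_size=2, stride=2),
            nn.ReLU(inplace=True),
            nn.ConvTranspose2d(16, num_classes, kernel_size=2, stride=2),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.decoder(self.encoder(x))

# Usage: one forward pass on a dummy RGB frame; the argmax gives a per-pixel
# label map, and the waterline can be traced as the water/non-water boundary.
model = TinyFCN()
frame = torch.rand(1, 3, 240, 320)      # dummy frame, H and W divisible by 4
labels = model(frame).argmax(dim=1)     # shape (1, 240, 320), values in {0, 1}

In practice such a model would be trained with a per-pixel cross-entropy loss on annotated frames; the waterline is then extracted as the boundary between the water and non-water regions of the predicted label map.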
2019
978-3-030-01369-1
robotic boats
autonomous navigation
deep learning
robot vision

Use this identifier to cite or link to this record: https://hdl.handle.net/20.500.14090/5763